[Binary content: ustar tar archive, owner core:core, containing the directory tree var/home/core/zuul-output/logs/ and the gzip-compressed file var/home/core/zuul-output/logs/kubelet.log.gz (a Kubernetes kubelet log). The compressed payload is not representable as text and has been omitted.]
ui)euk9aLGIe{HM;Q!(8v)&8 NOp&]g%R5YC)χS3#5bLq!@D5o bp 0 ƒmj8. å7;e*"SD+d4&jci2|UFTx9X mj0X(h,ʰ3TJϽ%Y1Dzm{F<6 .5=X~-T 6k+ERb~5ɕ /zOARDAoM@XyϗI _Wj*¦G~wsYI&pwY<ܜOB*3nx`9A+Ed"&]aҬzo/|pvw~A7c9GmG*Wpr܀r\ mOטb{LwjvƆ[i p{VjetˎQp٢=h'ͨ{E>)`1"5&u/:ع7^D-4{3>ct/ ޼3hƓ@[pWk7`']9hFT[fz1BF`)~S()k-7'޸k%so U5ͪۀY֌|GfUGՃ3Vٶ6)/m@T56eV9w{)'^Z*MW׹ ޸LUiUQvME.[Eh ?^nX{3۲]~ehR(*T.*ɺ͟/L=7\"ˆ,pK21|­e>lװɅOkC}_~K قiIt*8f6Km1Ij="̆fr1KD=r}zFՑYJ;ʠRݹI`B%+ؤ|L.j`9ՔjM̅06rz6eKm ,|6kim:>m͍:6,.,t]&KT婡 ?ˑp腞]iG8;2*%UnJpQ⸺8Oώjdp"WΚ<"c7b;o|ř6:IӜf?!4}*φ<TjQmIʗq +_#P; \B#L9AǟSwlFpmGcJջ@0ƥNFxUX.*,WcBp[]T}b. 8 m;SmgfΜoZB$VDFn`z  j6 N@O\(2k3nU_g),Bu?o/^NƗՃYl0i6Y;;$qq$plU1fuǹhiH4N3K*e} ;v1iAtuolqRJYEڶVR(UC矃KCOF!, |AyhPQ=׈bXJ?f?N~{z\׳0TKj(qgI{4^Cj,J.YJ\b_\o "/^z::%^E60Y`=x0?>5!]M SSVwjWWwyy^,ݵ/D"Ą#Aqm1$]xGd =蟷N8E9/HRÑZˆaT1x:ORǥcZIv.y>?C-:;tdk8`#+L >bKpiTG_&7ûu4 PINwgO7&zzQx OQ)|Oa۬,70)@ xvߍr^F|O3ԛK/R+?zp~ nߝVٖgym%oUUaso@jaKCyJaʵ*gD9C#fxÕȦ#|j5f7>>:&4[][oG+@ڪ@,6laAPW1Ejx,&MRlERfuT}VB5(44bF"QYJYTŔT) :7 b;L!4 {"Q9{D%4RAE騖^1 HE,G4 mXHMOu߷F,z_oPW,&c!ZTDVA S+VinX4BktQD W# jPQ$r[hHaW+L,bCH̙8hq;Ktpd|C8c0f+x*w}IKvm&x` '#rXGALpAɥS.rBh@p+j$HRm$>*S |QoOUAcx<Dc*X#V $`Xcxa B,Y6C8}CCvU) Ue4䓲*"fJEl(o"Xy?kŸ*q<1d2h% `P4ʰ3TJϽ%) "#^ca,5OE;^ڕ;vh[+/ngPqZB 񕱃/':` 5\p8sp `kHFI%jޑqi8wwc_3à_)&T 5ܰJH%+ "bL܌z;B?^yi'NZO_r"qɻ5UkZm|̋k^BFw=0ݬ>BzztW X ")w}b5UѾ}ѿ@%"f_b< [c_Hޣ%i5U_d0Co7Gy `EtI< EX#¹VsƳk# L8f/)ո>|դgB}$k?A hh{2_p"W>XPq, *k6UOa:k=nI:q Ս4q %^6GMn} >U^aI;U/w ?5wy18&}F<'=^3 L^5E_ t:*3-zSl ZŒ/;if&v3Yr'U/;.~c%)y k-3܎Anך"m9 Ƙc|09ztzɱ$ rJÃ&v6 ?}2azݳkQnoU#ˤYlw;KK1Tzv }&{ YENky]Z[cR;R#Gjd~sVk/;^sXe;Ÿ೛m>}`[~`FTKӏd[LgǏÙ+nD5PkECU@șpsT~ *TZTgJ@tzAa"r)& l2Y 1`lHQ(r K?3U9^ѳ,@Q+l~]$8I9DŽRn&x92J#"KrbrX+/B"ɳb?OBqoIQ歾UEEvULUz-Owt+ѩvqf>U7Ry&R,rl,$0p KgiɵT=qe,:uu? 
)np0sHW[]E0bePx΍uӬ_[\(wp)BSϘ8P=0#,RFX/^N[&m|${o1s}:5f_mܿ?nKugҹk mo5b"{&SV <ɶ̮ ߿kG3\mC~2rڄ}?Ov6[LiӕO?4[li\KpQL 3Ko)a8tV[;>4ݻZ.Z\HjŸ/ B'ixpoBG7g!,-UOAX$rRk:hf,jywpd|1^0M֢15g#0Ysb*ӏl|',=W?tCZ=N` V֪C+KFDoAs_ S RN^ZR^fką"RHf 3qJ8#2(̛ !UJdmN%zydcOPz!X{`K)҂e$#e#uY32~mlutzs-s߿Ό|J@FCE(qS #2ة3MuS,y,mͷ]PoSCnvoGx< 39b7"vs&w `%ڬ:;xM&b߃!fاd^P '}voȸѫN7[׽ q5< :YhT&Z\QI|rdW ~Yo8dgB5We,^o~Ͳ\MOe|cИQ]f9U" {Rz *f1tcE3BU$ ! ~ )X ULHKGkAX>[Gٺ~\~z&/7Ů{Ʒswn/VY`B<UX-_VcB:(^ˈiDک iȑwg&BTA%n|=p 'H B`#m@KX˜V!!(3uS/ H[XA`IrY$53J/XӲ O]5z*rX`}͢ZUW#ޜ>-'g:'SdpG*!~!]]`,pNF#?Tor1}mho:1hLYm}u"_D@@o/몵It:yqXGo:}GWklҤg^"`xYME⦷ M to >AKuT\;L[cFn߯V5pY;YYy>`nY93 (^_CO`Jb'CzHͭ6 ƒϘd&i%L=nT]rN mludhb6|ᄌ_-WC»Њ=3jӇGƳcn>7]n~҄{R P} bi5<o;b 5D5QI<̗=/Z;yU5a&C;)K:Mq*YŁreZԬ8L oon%h+/~7!=S(v%r,<eB_5NwLG\b4`*S}ql ;JjK SVe{OWtZ3=Ց?ߊ"(\\QR"O93`2Jw sd.r"Lf-N()ya`"ZA۠FH"I`vęeFHug^L7yB20dU SmQ d1u8q{W$U%=sLbrr@I(:7DZgi0؇pJD5BaM^88C@ 7r? : npi,D8Y*KER AĤtsi B83(* %B# 9`R]LHd0=Ԉ h BJ(2fYW-~{6jJx,//IK kmHH~ 0>ǻncqguHl+W=|EF"e p1U_U|Azd ZxTƄR1 gcF&y*Ч |Vf7[-##3p# L0At~J6,2B`$^VP+ Tn[&#!a cA2K3>c9.HPZK QDcb5u;]_c6ntx=L"Wg$جH\SfJ[m?oL#ZBci)3Z'hܢ&! q"i2RJVB"mEDoclK=ɮ(_o"kʷzLO!F4$! 
y%=y9-Y-0Yգ?XuI]{ O`=Q#]v,zvHyBBM6yd߯$h]$,eګ,J5LXW.N5oHj:EiM|LrDѨ9a<,ԃzpzpv2VL18svF'+8$e4v0He4o;Cp:h^uG0I[ 4Sw͘|2&i&8lRStd,&"U6!)q_h{Z0Gx~NN$2kt|m9Ehy%{V~5ߵ]nz, ུ $C=2A?\ !fS: {_\"mŬ u `S5T ;_' ; 4ߦI }*[_ѧP^u|;7ץRc{I'vbc0?T&B`,^Vok$\{e2v-#֯GNQtP|l].y y'o3M1OFu<|ߌH'>Rg~ݵ~_o{z 첗I/i&gZ"NϺ5b; xv0W6't%tӯ@,:w.]d믝+tWq:wjنAfznak7?0o`gy9L5м|S3}|ˈW`C^ M-zkH rl]%#q3[_wN {w ,ge-}xZL<20FRr&s9x'=ĺ:F)>BW/F%VwΖ[n=[7uu.;ycצdpw/<dAb.n|Lsru;GxBmo/|yqDg;/]y49M;lr8hC>V fۚ#c6_=* l\l::}8+ȭ֒T6hP`9}/:wݞ}E$qE).eHM;O3cRlR‚29Az/ƒn;bv=sogs4ȅ:5<ּ -I ݺ}ԵcEG[BG;K]~@CHnB|-td\DD ~lH g B=SzZ= YNdR dhRH*ŅG<&  \g@E"Ykc!'6DϐZ]DnhnYx+ < %t"Xϖ<>҇VJZC+}hũUs+}hճJZC+}h>҇V'UJZC+}hVC+}h>҇VКlAF@%鶕 DVJY "+AdEZqKq2qໜk|wҗԗ_BCڭsN5F|%2Tǜ=F!hy ^yܗim[tdUʶV*Ze[{lkQ؞ons^NHb\lrU.V9*mU:lՓ)mU੔rJ/Vit\ԑ:RGVJ]xj Dg.pE8 P$,^ԂJ163t:*do{|UĻՀ\.;y51||>+!:Reɬ%Ά,X.̊q+JlT9C e \$8[oaK r q<'m_0Un!D[zDzrGNW?X$C<r ycX &HTFELF4ݢsI%YmO BbPpv..K"yb(Q2G0Vn#r#͑y4=)jH ]ɝc'P$G8i{Eh< (&9 Qޡ (132`B(AeN%fV̪*[0Vii#Kd.J-H8VL?BAlH r $.01֚si"Hwa2F*L` Rf%.9lArY |O9d&-r $%4362i*5NN}8hgN,"NQG %hO`#_|k6B1&2olfylfrp9̞.DxΏi 5W\<\b3.;e{GFKe L\;;7jHrtXR<Df,^]Dž\t7]мYa!s-#05fSͩHwE؞[N98?;9]i.?L&Q10.\9i%*#`5+#]L//!F :=;e`vMgugљxG3+.hY"Kp[ג-u͈eQyK*p8`ŲDO.<>9>ipƭmU[r]j44)Q}kHiX}4lF)Fcf4 _wS`A0Cm||0x<#:|1y% iIO8.Paj+ h|2M84qvW_+sP5I꿦C<NO7=)d"_?ū\×@ c!aX.݅_ׁMϛ7-D_MS{󦥅MV}z6ᛴv~]nܥRǓԙ+ޕŽGT& *;qw'e?Z_Ÿ@"v)R,&͵cFzBvΕ,tUI$J.Gv9ggШPǎ6̭ј[gN)ztrNYr& #[RJh>c<nX+ #s!Ǻ˦Ny&]v\2^fE,(h+tF^ۍD[V".*+;(=4ZVH%H\!U*8;~&>ȰJ:)ZN` d%V;ȹtƵQD"do6*s@BVR(f7 Sd0ς.Y`9yu ',tbw{Hİn`yBPHz :bn bҗWLz8uodNS֪#(2U!k Jк%A[Q7U^KzSy`+e:G\9sY& qMFTCyR⾁פ<\!yWhtoWJ0q6mˁ V R= ֕|'SA< [ԮW^J)je2A<ăy+kGۏ?=8H%Xk[@/ o5gOTi*JLd 'r"ZCژIfTn17 9z:3K< VD@(J nyoݹ=O6q<ڭIYT~ldj7,88 |w&ß/|ȆE9+o) HR'lQIn%\U^wl~#Yg!׋%D<1x?+(v%|[1 8ߜֈF\@rޢR,b~Mr_5kWm*D%-dPЋSd0^` I.zŏDIk6 t6$Y&$pZN&# ?֝ [r?_R;{t|]mo9+|[ܴ; ̇\Y`XlfbβLWdYVjR b5լfS|X,~ -v[.WУPڻUf(}/v\k #I0&"Y-$Ĕ-;"id,o@$52B XqjkThA)J \]O-q6#x%p<'i\ST,_ɳԂپgy? 
CcR&Rp8ONRɂTwePnʮ fV<$XQu7Z8*]o`'M.jUY2_|88|\5t3ۥm.ӎɛuɾ'⮵'xJmR$vLq`ͺs8y9&$rTc^I{ihDH>T\ NhsAq0Pt HC:)FϤ1tF^Hժ9Qs@"(Q\f' R>";\Q&)EoىnMmEQD}U4FXM%٭8B-6)G؆:4>~#d?6+JR9ICb@Ik 8u)qb>YmNm̊סkKt%6!ǣXEx6Eߏ4z{Ig%fl%9h L*/<(jJȇ6rDm؀%, !,!?iI)Nz)D%T{sm̧BxQYž:u0f4"Gg`T4e QF E=Y ϵXCXCZiDI#o\}/t&2QwCS6`w `jd9Tjdi {R(fj-_==i52(gR `EeVX^JUɅ0 d2u V/`g >]Չ{d,?TTe3Tay󙩍&WW_3i&Tu|Z{]P:L}=~؏zY;n{ΪҖ͕jWaeS ę0_ `:t9JH|ew2j52F6k22|ʶZbm^$«崃)=%pY+sɳڀWVC`]~<15WNJ7 Ӕ6mb<;lۆ˘6zz P:N:+wLVn)YUf)4rX.8 3uY. -w8sf`ƨWa:},h:,|wl6o6{ҙ6-|7"EKx*'t™2Qk Q~?!dX 9u,#KIu2C!b8(01WY\O\ei=yoBv{4W ~/#MWJSAO^ũ7药b]W)m{S, }A| d!IT("Gt~oP㳕*1 o~5x×Ûvb㽇t]LA+Mݬ%]R:`XiRb!Ҡ 9y5c,:uK9tie*ոmRGr/OG?~kC0\@E q)%K!BL1 !3 tfгxy*rsKιdtO-T3A ΕFZrFBiךZ VtUVT۶,C۹XKq::qc! k<)T2eZx]ІH8ygLD1'ӊpp*NS;gβ1fw 0U}wF,'k-߮û~<0C>ovPL]<)xmwoz}p=1X| eoՑblV^fa9R ~Zth3TڥNDj: P ,}rBGȄ422#0OiW4%Hvwo/ߋUZ(,c_NH<(Z #2+Fd>!6Qlɤ$AzB2(czfl $ TR$'ă:URh'$&=ChGzZ /{02wbk6Ϗ~Ck56ojzVmQ҃j]Χ䣣D1%7N1gxD"Dz^I;:gchˀN}EΖX'8Qfy8*.jWn/mwe.Z; :iT$eN;tШcR%EiXJ8fI{LmR|AHLr0h~Xkb2xt"UJTqΡny=C""ZBN"ԣnWbP\9*q'&Q5"m Zt*tRS6@ $EZ=ҏyB@qCI$L&h}RJ; eu)&)GDJC#OnXKΉpihں#X^_nr5"1[!@qدA.eadD`8nRdCm?~4$xcO\馩܍,Glaep(Y|8zg7Yp4J^gkӳ*ԃ&BF!$w,P}2UTƱ"s8bXt)'1͸권}*~v>WE7;J6sPVA(+pt3<;+RkU^?boP_.| A'4?{Ƿ~@wǟ<'|/sU@  $D%\$taf] >>"AMᵁm4'aL"G֐Jm!H ex<uZYi*6hhM'Ȗa OVkk"vgiȳ+ʍYKKV.up](jC2s%>P:deZqJM3ı%J 8%f: IKtOm tH9xni $2$O8 H=gwO؆@H`ˡ*-Dsu|FNq9)*Y &@YgD qٸxcā\z -2E8x˔ "8g*BY0a06jo^k-+< :ԅ_)hSpW8M[σ7A9JJ80%w&]ii2 1ltWMU8j$*Ttȁ1G=!K9 !zΜR8 zThV#&Sǂiϩ"hgbAƩL@6E!Hi9bҚ8[ΑHFïTK,q>LxÖl; jƇM6[`j^{vZiH HvV Gљ)1 (B[*wEN_o@$521ldhښ"TuTx,*yb^hK^ɳIjrN_]15eOQykγ|R GzYR_rBC9*m8 ep8 \,˿R]6@c0Hv7Ld ,i %G%ucVIU:TVE|d=TdEѦ!WVfiYF1ڷF k=9]s AHRVa S Z D佖豉Sm^ 89%OoGG'b :5CR=E.(qCvt $*Ű[)rʝDM./E$US>K7DatH☁ 3P  D96:IXGt ¾q410BB2PPg($^2!YobA% 9Pg HuAc-.zJ)퐔)DFԻChKSwPwEyxQYl?s-Nlr:i3?Wgu)B)9ħ7P]^)o?}|Q(/]O~0/[1G?q)B}L$.Gߏ?~ 5^ms$ g0(!diܸR:.3?y:xWPÃh$Cvh)Y"к|{JBߪMo˻ڞYT=:P=*Y5ud͋jZ&t$LgQQ1k ÏY7ҥ]͆ ޫ 1{&sP&/TV=r_=c_FrRz"Eۻi!ajaJk!XN s}HC6D3& ścqKREq|8Ij%o/5O.ijF`eEh$38䳈6;X ?Vw(d0mmz`5ҖWڮdp~s$^MO7gI7XB{&CD{R`LDX,-,xGI4 
G${Y%NK$|ї0w۵0J҂I5Ҳї9/z<,2PfO=a0W,B[Ąőbc?1M,ZPzWՅItP]@@v$mqN y1vwassr+.ӭ:Ja)6,_qGRbIUhq SZ|,o' FŖ1n l;:Lwl٠v`9 ݈1*<ǡy0Q# !;ݪJ :хvk_/վkdh[.w(oJ*oG|<}Q^wZ/xsˊ2_$]NB@楀XeQ+nq;qAH96y,![nuIY+Cmźsvm+b$.vk|K6Kz &&VQR1qHט@1WL#,8ZIuqK0J̓W?nCЗ\ކl8NRⲣl 2e-6}Ւmz7ݍ_G- siø);%paO9vtʤ`ߪdi3Ks^+TiV{.{UDK-O/hx!rF9ȉt1#"ᷠ5}4<%d.{e}KX)"Dma0㹎N$JŲ4Jg9ˀ*`%:-H|0Ӊ8I3^P~;wnベ3svgS_ D~?[`H@ʦʶ"FniN2L0\t;S=g(X 9=vBh4z5^r?53Yb͝umoߜȾbMlsTowϸ 7I.S+R3Eyb&zP$썛7^F^uGv FvY4a#A29HysQ 9(H>c8[ xբܯ vIV։4-4Hoŝuσ7(g Jsd|p թѝ;Zᧂ{7zH~8TDVA Ǖf!.ܰhcZ!fT0۩'``g]."_?pI얡k)լr#hUfn.0IK~\L:&*Ϋfֺzma7m7#?u"yf|kZgxtyփo؛p?`dXlCfM̓/5-Ѫx=)&LW+Тz4DmsfZ8m< ,7ȕ`ڭ٬sY>un6۬/݁;/é͋hEyV@\戂yʙKQJ̼Sk%sHZ鄒1 &*P ʂ $0Yf3ѝ݌N\ɟH@9ħ7$`է/\/x)~E؊<)܏'0,/ yAKamJbbv] D4KfXY)זU*=וuVAY$Ce) >h[Gueo.U<ߖwM% h{@ 5{ZuzT~}Zɪ떏efE5YU3MFE|a?fU 7(2 xJ l25- 22s~3]dKc1F1sɭ`)${zN :!$!Vzb [kp q(&)M2щn}JJ۲/KI^S`a_  N[l\:(0@rTa=:?¾ 9PMO3;]݅lQ_nÒ7" j:,..`fo% 5_$:9q #L>?QieOҊY=UsC~ݢb́7 Fq8v꧝]peٴ |ka!6g31~iH4N*0 v0bi`iqRJYEڶV|¤"9CC!ibKCϊcٸ:X7(COQ7gC尔~~ב~Pt+cPÓ0Hb(!gJ{4W"s%T*a09s[ ޵5+翢dָ4ny8l=RS[aX|H^oU{8$EȞuYr@L}Eo_~//._f=O^kǏ>z6 <no[+|cnmZ>joh=.1u Al#GTa#ń'q#OkQ#_+v)2"6jIHxA@"Sꌪ&PJ Iцt |Z" 1G  xv&FXcb!,2)] P(TxPgP8wcqΫ}5i]5/]&H)y,j7=7hB%:2v Ė~Wȶ% e4g4AA~_Mrl*zjT \4:bL@.PAAgv0B=p"F zɌR$^y9`Bo"f6էۛmdP@P{"[KdK])@&(aU5SQXeWYUvYoMS)B!cS2H$%C2H#$Riy3гg;,ԇ*o$_)K\cRI%\}D"Iw%a,)qdxc"IV#tlBT4sh0)@2؀xlJ= 6Q1xpUR=u|7%γ {ŰbuͧnZ`>t^Ec%]p콊hḞ`>7NSv*=rA^;ċ \vN7Aڇo/6(~@OtUFW_j~iqo"/{wɇIif}WNѳ_0ozIǬm1f ޴m~gm6Ԗq|Q!=tt4.dR9 ^67p&T}!ٹs5@6HSbu6d!)usb/rL9I+vsh9~u&QJ 2+4af0z,I˶=n~\g= = 1=aBV48䃛=F(oM ^EX[unQ[v$~XxeGJY.:VJh~aލzxM5"Yp&xdYE1H) 1O" =.8qAP_'yk5<{SEn!^]J0qlہ#4h])PyH%W<4ajCoR\ C~!^il=fݗH R*4ԱY;Uo ;䲽CMm*!eIADJ*{YD̨E~t$OҷKǎ͈D 2:ʚbIĞ!a֪D17#pevX!AY&_v4|8eQ{d3/E~#j|Siɷk JTٖb"OYD`D.Yɜw*ʧΐ3*ۄyWwg4xț~{ȁ( ^=+^v \(cT|=GH|9Fx;/M3Xn]P=XQ_7_cPȶH+DhӂrV+hKΠ$Bj}~TFkA T ).hPHk?Q,|d.4sd4ܱǷ!}Ho>Mx؋=ܘqܵB[=f-E=@} W -̛)zTHh#T9* Mmg)ɧ,U!z &GCI$SYR:L؊["0ﴳ>K_ګ1osOfO,gy2DΏo?]ĥCo<^z:[{a.$YS$ ]iգC9:+'[w*:nm)ZG#녚Ǻ%T%g>4OLrבܵ>sd,/.bRgfyu{xssz6eLa5#*SRZUt*+l @lTDv xྏr`Բ!cNH`6³&̄$U|k1a1(2blab#<XR T2e(IT0edJ+D  
16riWag`MqDž:U.$gcŅ-&"TJpU! z- Z醎PT~DUQֶ 2g]rP)7H St4:D~N -4-cJؙRSԐk:T鮻Uߘow諭xP`G/1CMs5x5ajV&3?k\A$|`ҩzz\<4__>ki~xĺ{ 8^,l %Ff|s6d'MU#G7f)^)C#O<Sϖ>V{P;U"I*|?Fڶ<x nRnHRA|tv({8|miY^w_&vyEg/Pgaf ߟ>CJl><;ytBUVL*Sxrس@Qd<cՌ5Wc&~]MFkub'ҖDNFTkzIf9p 2+j68yNF6.h1PVVDsH ,rY\Z~lѤF݁7OϟlocMU-Kx TqXtʠ bXeas3 .4A0GPO?_ҹct%otch鯀hi{$3kC,*kd2dd DTe+l^f/M@t%w@RF[5D>!RNl:-H9~e`7$l+.0=JuX.~0Wyin?O3X]~-䢺g)@A=QU鱡#ELJ/P GM> ^As(UQjf'Ubpgoؙ&ETP&U4$BU6;+H0`Q_2m1Q!y#PQPkFΑ|?6[GilԚCt ѱ b~yuFPHVE21"7g#,В|/%V*pX2)AȂ8# 'Xt(GS74I!`IOlޱ;IQDcUv(ֿJ:^[2ZxT3 Z /2$jHD9;r[yJ]= 5"Cc#C"?9~r F˘eh !i)@=ڱTf;TpAϳXs!b&MoC 9.M@Y5d0&L6G&Xx/C$} . C4jA_iT5,ZZ=О5h<7nC:FγueVS1hAHɂ EtZA KWc`3b>HlE Uz2^΍)ОNY|yOWw%~JItU} QS?jep~m^>A^uaru%Vo\mr7ݮmlo|8?nn*m-?rzv+19VkHf՘]6_œ4P;(:YDDY(0ԙ !Ő!/]mq-Y_{zs-%!=Yϖ J$H %RuY(wjzjgb(/|2P sL)He, lٚN9GB{ %sg$e-Jz];"fub5c2|𳟘 Kc#!wn9!q@nOoy|7z歑v|뫇V=;)ǝ6_=01Fx;93S9l㯇{_,\[k{מRtṍe|s)Wl 4oo~ By7/-\w .DuKc,^䏮'!Oѳg?Y{4挷.iGʊEJRxat!iƠ(ۖޟѪ@#+D-.#!xN e,Z?Ç(hq$8l =Uտ@o -Z`6|{GE*dbvG7=:R96&LH[%WC i*!mJ}H7&7N/u4tj<8,d z(t$B(FL|obS[p,8Q!aaօ#6C(+AE:a% 9G2ԩwP("G_@`T" s.5 ЮkvFm-O>̂8;'>/Y-LhLFMԞRSdUVc1-J ZjZjzcW5.9gfK=XQϭj͐)V'O'#U1ƣQ< 6??A</-*>o8$5] u0jPt}WӕJ~jigS/g^kLhfkG }EcuMY$Ӹ)IdToX}XBf;/.\xZjTq >A{Zoq3d1D J&e;:"L2%" nLu/mu#aBZ+)gH,/| 1z+ mHu7ٲђ5fox-7.yCzp%PdIZXƜ :ŠIT\tIG+xv9Aŷ"yhƷ5YYDN6wI (FA)3 dJs^23non[u<:;nQ$1dTb `C(dEŠc =? ׈۟jkڪr>j9"ݣ0kXX$B瘲1FJzAV+K[2 0,Y fS-EA&JL%]DEP;mR[v˺,(P|_['_O?fIBm+ub1xF Kdge1IaNj<'!է a޴aGY x=eJքIuctl uZ qG4pvu4 wzeWDh崭ogyi$#|0j0VmfU^Ȧ<WbGóBO<=?597ݣ.&iMs5$`9&b慄ԁoqy9ɃQTq,`4\ 9bTCjzIE#6fN^kN6Txih|̧hS5̩̦x:Ӌw,$a募~yY?/o~|T͋z/?>~/kߌM$H3 ؍]ೡ~C+[Po `{Յ(YcIރs/*g Q*'*?SR`xSD3&6aAF&LAe&+;а1|vOsܢWߎcv_Htf8sTzyR=~^wnhRu'K'R!+.RCdh]\?O=|< n~sW?{7 GN3+bv " ظi2/vۛcdzn[j_õ%Cً6z[i Rv[JCPTc笞OxE2i jUGLն%KI}G9'f~|N};'~Њ=9hyPEc1"2#+P8]mSFV7:%oIk7 Gr \aј{ԢLODmN|@議l=b>k5cfd. 
7@yޛP,l@ |û}K]BF:\^S:WeQcL'L :)L"NwVԻvtm丯W8pt3 !i&@fܸ/:{d|p;Ϭ{g`MFCTPB_QY"(&]l ڀ/iP\[gLL l."T[S^@ )*+I`SԪH];:#~gh|hr0$q̪vmbg?n]A\PHES SZE: (bJ̩ +bD--.Nw޾e?Mk`gG1!zeБ=tsވXo{AsxK2D|4lS7>jDP Ɉ~m#)H}@cIT3HFY-r SL%FWI('Ru1]@.Z2נLE5Ii?0Q8b;Əe8PltOq8w9OlÏmônn|yL ]N͞YAO ƟZsGW֐fdI0()?bQ:n[D/Զ䚹=Iel;IZFـiuȹ_⬯@ǣѴlB\;X`SvEoY>eQ] mzeoPt:\ԁX v+L;* dQ#_x? |}>\\ݾ]-F85Ů#PlΔpIs u` m$I&ؠGm\/Ez\J$}Unx8zݥz2 (Rrx".9Gk#_31R#9Pd"۩yAXj"iX(o_)2f=Vi΁4SVرy9h~;n㺯R[bWWWg; v/佃n8_J`lWqgJ߻盪p4YS?Nhk-Z"@: MAk-kLh#Nu qz;Nj{/(#o6QSv`{[ 7_F96q@H{a9CI1jB [aJJF8T2A{ cc]>@,N]_ףl/NTbb~j<,B1D3iF܉p[5_'z֧f:kՠ:00m:#\f~?TLOoEYerI9XgM6, >"C}û 0i3'#ݭE [I㳂"[6/xB\\V'$^0'Ku/+ݑUƿbxaZZ6N6zIm6fagْmCen=o^uI--)mVW+[GEGt~ ӁIhszsl86M j&RxrbV{C3lަ1Dq<^/k^43 .;BchkW$zl_@915w'--6e(M:`gXiR4pCdA7;)͹6F#,xݗ9iI@Лۛdh7vVZVvϸ"l꼮:!В =Od4M(b!Ka(!{0I ॏ#rR8Rtυr9ҧtى' ښ3O8p\DTQ%dD \ *R 0!(I4vE唗Z)}VHI9E !1"D6N+}۵Z#gdC;S:aMvd h_ )_ m#K=/' ^q8:yOyp%{ͻy{Ax3F4-bc0*(8сqg8s'L&%d/m mXhyhE&|[0pÃ^I=N K0V{3OA%1{hO<`SZ]C,k\dDؤzv H䰓Y@Blr`$&% x'U|2 YݳJB[)(4Z?>i^JǻXd/eec+Z+(>\iA|v٨zn~^ GWLOYv;?;?2xcڝC kH=ah+\HHHuH.půiNAB(cDG;N1Zwz8uٴ9Isf;|y'ZvC_kOJL.!((W@bp`dTe#ʥ>i&R )K$)Pnbkl+)ӕ.U_~EFUccf7tXn =kI3eCWz>e>,En]֭[`]R(wƱ]v>qY|f=/\hM34|oU3x6o8y4\HPq4F _fyΚϟ5va9mSeFOum]jU[7WD Wv:[P /6\weEHZND颶 @xڻڽ}ԣ&*c Ȟ&B Y&H`0tĢe=đ"BꐓlͰ XrM%Y4/ϩ5qbv[+c7>^q CH,ּK\c%IIOKA"ҐH$"KK$*<y,!D$mѮ%鄍H2xƒ4*$BPƹIBM*$ <#:j>1eJ*Dtmgk͐hY u  OZMR'> 'w$ϑHM˞7?(h ĻLl"VM}~yqV_M.>ۄC1KG/ !_}DGx7Q+apG{(ep,pѣ䬲kș?*sv^c7 U,)^E[-㽧$iwn[ԃ<7Kw2 C}zF,ϛ;o/ f5[Tx89xޕE_T$T{]>1r@n,nsRoƣ;;sGY9uTUVӮLרRW`u:"WBj)!]WWWWF]So]=`ylg㪫QqճTTwK]3jשBS%NH]!QW\NE]ej캺TBFu$YD}u ^躺侨9K]!L%# sŅx!ݰrvB] fH炟^߿)ƣi4h^4ψ3朱slj $?wKN_@뵧`FN Rz*j:SLtLF5Z@=uI56Yv@Q,E-Fø2}2c>Uw}Ǭ$”J^>]0NRK~go~[E M };l\ nn٘717 R6jqrW?}řHeJĔN8S&&zwR0}JKH0秳8k=4f*Tnomkdeı̔/ovfgRYy_o.iΊx3.Y$\VEO9Qv7u$FApi4)/C ~ӐZF7As^6Ԃa$5oLJL8Cs9qo8 oξS=x T@>yqcq+,RiU4RB @'Z2 HJ5&9SHΆ$68 V3B.i\,׫JN>o;t }msJJMd3BDN HWw63K#5wV|Q 2dnmh?<3T\Z(y2 EH.u* EZ]_(T.$ E_BbJylȻ'wZpHc#u%Z]BtRRC[;n5!K’qc`0K"9ߵ'址`@ґvmR!r&PP0@p&e6GC0Y@o;HklN.:h|yֆw+rcT[&0-ΕR߇5 B&wіǶEMs:e^$fy#'$g;@N  {Q 
fzmExgHlZYQ}Ɇ^T(A'ms٠֊=Y mD#7)6jk8ZQ`YieկbP}1>@A )Nr-704 {NY)\V_0Miŷ0ǃ,ALjM<(ÁDVs?hδ[#gbPcg}No_^CA(đh݌w ?ɬgKO>vn_/BtZ;5@+M8Il$;c E[&8#i7T$52J1hښ"tcK,b[,OW ǹTZ+W_h])}6j,Yx0e7Pn;C)r079>rg+{s`':0au|q jWHaT借/h`w+bJ `Eu8}Wۅ̗L|d ;.hEN: @M!)F#})s¬B\=2 n"E4e8@1.@)Eoz\[lx9ed;^KR0>ypx}ٲ'ː?3βJ%Q3uBs J 0eJ]c0mŸm>,g.x#@Ah@fBL?p>&I@{s%|T_8_KN˘5,)mY邕Nʬs)K*CcvI()Fns.`cLмI-bxhj;{o Zvmn;1GLflbfy{ -u杲PieD@'}f_|:߼{s25N''|_\;62QE )#Z5ɛ`u y%8QN*"s o|Vs~!UΑ>Ӷ _^4߀(~pK2rcdT!X)s1Pu[Lr7Lm i hc Br|D$֙SohILgwQN<`UqqȾҡ Kw7hlM2LcK.KPj N ]ɉM6YwWC\~,:T§ y oѲ($Rq}%Uu1U9ХwIؐZnUzv>ƵJ-k6sJ>K,h`୍ >AdTS9o/J4G@r -c_\) :C/o8PR:?P?4GF4/?}a#sUarQSG(=~-g񈜥ks(~ǹGA6u=}lΛ>|g\?.<3]a^g~[;{/O~.Xެ ~gz7Nf |܍]Q/aۧ欇ػݍގG^ vhLGTxt6TR^MS OvHuTBYy0JHa$h><܊2gpϬ;-h-@xRkȷ=)M;!'R2#42g98ZNLOL:('Ưm06 ՆUzUh:oR1u!VOX7Θ;1g;3ƕ uHt"XU1cx%wU$OvH^ D IA9PI 80ҁ$YNQF8 :D\J8.p $Ϫ&AUI !#zj1!15ehJ',FvHT  Q6N|5¿? Z vD'gp9PFEILM5*43KB 2ԽrDaRk(c$rqq#I)S>jKS$wmKcܒ۔8F}FW Q7g0H bh$4Z&>iʘўsBL&TBi!$Wx?n )m={ 'hYTTDCAJj) +5C!^fǠk}º ;|A`BRgAzҥDjONKM,x欌8%'qKmM;ebW #2pH?r< K4rAdz!DPJQM'HeiL&\sID(IrTrPi.Ecp #EL;tk{.|~̭[.zi?KS#/_|yS%cƳ"a.zC #OЛ{r;3ŵUu=ęĔޯIogԠ)]SQB9agJ+)gn2??LQޜVg"U SrCYU^@ZxC0f)Uk{Ws;[WE.~[=Ѿ8>:䮔ʢ>&gaF8/&FQ*y5n0|QyfL菽G3͚g#Z~=y_?nz=*0[CDrܸ4_]Vۇ]qw";d`V4# ~aX05*Pd# G~̫wppXØtmͣ.&4j\5&٫ Kd?!w}7(8g7u*Gj~fOB.*xHTóAPfaWzF,?y5U[C0U^^Uw}]˿}opI|?.??{BIH`~>C݇f8bhn.C˒/.ڒSnOOǥ>u 1c>,H3:X.ퟔm}KG;>I5_:Itf_zT>/` bWIJkDMMurU 䪮@ ^ cNT|p1(v=Us^Z /t(ЄN ua]TN[-^4DJR0Hp9=1BIt- -̫Ѡ8V:@gA+LrQ8N) (O,&TЊ:\ۗHJEBbS B&jQwD<؆H ԧ*# @^5>P}Ryô˜IɹdDdy+ ^ʽ媋iO͎`5y"BɩpAZG=CJp4a׎D'sU"**DyJ5l:'q*l )%aQ$UDQms+eK4U"$IO.+< S`<,/ yJӤ|bFQIh-UdTV_dw"j'ٙ4qȌ KsQR6˂,`9:U_'J0 ݥ~Dl-l)պO]kQOlH4g%! 5 7L;KSfZ7(T"4~f %/H]ŨB.W  ՕZ uU6bU!¥+H麺*TJիF]Ꭿ3äέHQkyۨ[ ߠWWzn2vA,bU!䥨B-B^]JQ]Mbߏz^/+GtFOWۇi[.  Ly!/1 I&GVZƺܪt . 
CkRCa*xFd:ge4`W(5Zg㾃ݲzz3_ ֐Y7Ow%Wxly;X>cS&1cw_m\z<߾KA&dpLh-Y=;J=ʢZe^hsP"M™ Qf#74^,*P, 0vCS荴"2aXQ2*RR\Q[ Be.@0T`c5rvj>.4~֥2I|I~|yNna(m)=LogcVIAk8m0]md%&Iwp,%$ LW`9fo*$:g' B`P?U*x<M&{+_} ug=3y=ˇݾո8shܠ ûQYUp(u g";Հdύ)ZU;C-\nݭ#$iXDt}r/nMqMu|^(Ǎ:~kr(8KIaoM)zғrTNc:s#wt^7G7"X0RKINˁwû/{nh岣5^y,@EÔeȭ̌\ߜR:5R%rS2T@ nኄ"ې0I#/D!B4̄p5rh))4MkrŮ5[[k@ٓɓh\Z,VB̐K\[wn8k[߬7Eo_ĝxr&Sm& ^sqBa;oFhD흌L:/䥟irHy6,AАH&_jFet1{J 2&fL0c^DNQq%D֌١QFr]X3Յ..<.\RTmPv4 n[NM0ş h`02Lsm8Ĥ#" ELZkbhQ,Z1$Ԝ{O*,bcdZ(^HQԦT:( 6L36 !efa 5"ga\q.qG{}XCUe\9J9A&Mj &HE]p>f,s5^l6_T/c=1k8cJ9B Ar2j>H*)e2!$@K9Rޠ9%^Ҏ)i-aGe"u>*j|?ݎZjgAJcV`3+i~Y)Pkߑۏ 1J" A *`*}ґbDcD^Rĩ R. (gAX>D5 EjHjR~,K.3_ND'-h\}]4`^oδt1ns>ǻ6?hδX[Kq.mO\vQ`VtY/7tRnV rS |/ >& p <,E#6O3Z"̿4Tz<\W$^0'<^B P//xaZһZfmDɤֆw~;4dmk"A/d>i庤m5~z}<q8{ShElhE''zذ0#07G(gaɖ99ޓ.`c B|I-bx,|v4݄w>cJ#yoha_Nfީ Vu!:׾kfóK~PԤ?q8V87ehÓU&hS9Q1ˉv)d&-2H漐mgL.C">R3iV[i FU H^rI̚I"]\EPd ^jJqҰM pE c}cDzZH,ȉ(56lb,}Ҥڨ dEuLyߜ׵m=B\נ2v6]=F2 :ާbJSՔc,BL-JzuӴȭ>7^18}]{\F{[|_ƴ;ÿRfm zжj֙dBHkz&ՀSrM5hkb N}Tw?rs/E~k&V9:+Uq hJHR2KJ^| D]&Yf6༷&.2eeO;ygT@FΎ yrxlovnT?얩Vxz$r#il &G3yOd!6ҢNFW,[ ֵ]vqpcWFnha|x'2qthROUVz51]+NL꺶nj>=Zϭ䛃Lt4~~A!mho&Xg kIYk{?rk~i 'LQ61:bO3cG09qDaHDex\֣AZu+ۿKݯKqT "ZX70mx5:V5ـc΋eU`lTZ^krL$c#G,l% LF Hxұ*\H.HRdn3p!+ 6n$Bb/ sN a? ~#K^Qvo5IɲGeqO"[fUuUuS9lVJH?Bw7joo^BC2G-K%&zqyAȾ1iCB;mv:x-z4>lF{ .RXk#%Q2O@Ɣ4p-5DDp90I`5^ \VBPu;TDQbaQip4,A"A<> {< [Gfku_#!Ʊс@."LZ22C:g#eGg?WSҎ 1zD0?Rn r%pA`xpE48S_Az c+-bL)( R9dg&}Q0e(eYNa&;[v RY㾤F¸7_â*T(P ][ը3Qm,x@0o83SK(ڈVj5]!y,ԢUo@W2ETc@ M!_S)( vr}\-@.{J`UtbRYK ՘mNlCaZb,>f:N7 SE.?gGy3Z_J(H -&ձۥ  Z=<։3X>ƌjzy 1:TksC}⬽~0@ rVĜ853K̮lrQ}̷IeuYHok=$ì'YZ? YLl:^Ntu3|pvQ >!FzV*ERB/K]_V")>Ŭ oꔡh/VjMʥK :7qZO}_)`f θ*\EM?6P%<Nvg $y?~~zwN0Q'ɧ`Ep| M¯|5_}hB[MehS˸G/wR; QsڈVz:7"+&Cډ{2t1uNkd 30dtD:FPрB4B JF {A^ t6A GR3#SbbLq!`%9k m RK3,Իuőp.u٥%V5S*%dؤDhL$xX{?'v8LC+8ivV@  % BvJ鹷<+FP 1؀dll@^1{զ`P޽8cuBWT1bңkkH)a󥖜N)bxЄhW=,FU^EUr_A&/URMTkz7LB3II? 
2݇ETSR9L6ecdh\*WjU+ǢQNH1c#%J0[LR} Ha4QXdX^` AC4j[LJR k bxU~EY( uct~m|d@&&Dlbf3&f61Mlbf?iLXΰ;rgX ˝asbdHA;rgϰ;rgX ˝a3,wΰ;rgX ˝a3,wΰ;rgX ˝asAZeیsqn3mƹ8y4f,)6cpfیsqn3msqn_C80Ks152mƹ86flReRidm2 c4]h#`ڌzl} 3ǣϏQ!!nP,.Isf.r[TfO:Ɔ&94/Vm l2ҾD靇8 ~> E\\Nj]?GsQ켘NFkZ] QOPB&=>@rf (T/-;dr?s_{P3;h=j~7n:.U)#E-*qdK0F2aF9kOYn'0J`NT5⺸s}7<ЭzZE/VY?/|NLG k,Ԃ:Vc&ye4zl5pąD< :sMQ)($1=%lr@9XrKi"ƃǢǚFd!8)\.H#׼$"jxT8!V cPNbXXHfz#~ xsXBZ7()&zSNq飶ěUȐNGR1g&[3Fٷ[3 iϺf]pEQzexm&65u{LfM 3LIDEID k `Rck-~RE#iHQk %ImR!`V0/2.D>@_ܯlלˡh޸#kl}e ,mH$8j0 1 dT (b .8Y0hU32 hG]r1$0d#Nu{#~}X;)x>NUP4boXzֈ,kĬPVZad(QHLd$&`u $',iP΁KFbu}fۋgI˒6xI{Bz;E8UNUU7*e6nC>QNK=PZ抱i'i2OM& \lW6~ GϷG OjgT.#L.} ^ON3/y؏\)q~}=.cUEM@NU6^oJ{uћ&6U.I:p!CL& mRn۬po$eܻ9kQZmw\oKKztKh,_nar>(c8>l*hm]-]W^gWㄠޚy~",uyi8^%E/]'pJ!,7B "2s!~peEhܤNOz$[JP _b%%ci8|J?z3T͓.$!EV 1R_O/g2!AŲv/Te2m}zf´/. ,7Yn)7XHǞ E4A(Ŝ(aHr, wӑw;̝V[-Zb{7I)e;^u<\WY.#%sdf̂y8Yg 퐫с+ 3 VW,5/^:Xzիe'-%oyl'=}$meaY I gMInD͝n $%dyL0_^ŒVpnTgOLB|oqo]NZ;4Vy=.V6m'&^vLP~'q ocը6QMwZ'z@ӯX'"NX뫝]o7oʊ oOb@.%Z(/dARbO%xUNKgi*M˜474XThLJp=AKz>AK_LGiG#BwJ?$@&p#7P4&G.:e3MuS,y,ok} 4^o|;qZ|ѹeުº́5WK4 ֈV",fѝބٴ?\G Kup)M 9&Gf//P<ۇa+ wqH l[|)n|[dK3ʌ9bwhFҼX)ò2f|*PA V:,9EUL78"_#K]z/Zjc.,pM-@HB T$aUAxSΒp䂠^hABU*q' &[rHls!jI,V\AlF< 6+NS<\;Y|>{ޱGǝ5yKڮ,CY#b\ (e7QBS>?Zww3?b-]%8@+/?~O :>=Cv9̕QM5-QFl HetM_2@܌>>#iHn iRw@".hD#5!Dx͌ql3ӫ3f7vXhf'ҽtبI^kTR:s)ݩ!0{ qs9rޗ@cg M07cd>8:S 9z'uvq4CeQ'%5XL1SO!mPugJU`K\>R љe@NQЕ]wպՋ~{Gk,>^nh7].C'vKUzK|n_M>,JRg,/+GI2c}5JZ^sz(ͮnNӯ-<͋~ }ݿ±vw&eᭋovҜ[?mN()Zna7PB#4tZ.[M 5LP`CN~U;U~PmV=6u ۬/jJ9S/[S+5/2TDE2͙%m RҘl +̊Lo+T5 5'+ :mɅdN#8MPk#9_S;"C~K-I\e .X:z xB _Op]~#ri"j 8oh5|TmgH4L:nNZFt:rы[ kʕ s\G 4cKNr-4AS2G_Puy4B 5Lddc.1٦d!I杬jg-x0e\PK=`iD!y|b^ Ai-S@F 8J\mM(' ofۨ,IrN&%B9 Epi0RJV\"m<"ԷIjJ鱀96m2",ӋǀhJr"ɠEI3σ#% r2.]rp2 HMl"-+8ň xSCk"5_5Qd`\f24n)|(Έ" i4PNKk9G!Z\b,C2 7 EM|_FVG71 l )7bٽ]EցΘ<ea]W Dz!sR5dwH3 lkHhR2j2k#M*<9όC -S2'`$,IҪ:vFsܝN$tz=5;BbYNkJ韕Ֆ(9)I^h)F8#ͤݳۻ|qIGFp<ӍV6Yha~,(k(%DzoeԞix*xƘ֓^I@:($C& Dk=O&VDR).T)T(e:C CxY`0]FZw6SR(z%ZJk_ʼ"53ĂW91ViQ,y?ZQHOƚW$\y =;&<38l߼9W$g=^7㋋"mha&3gٴ!Cʹa3/׷'z$奻s}he 
23j]Yڭ~Sa.Nn֍yYen%nvV [B @s:_SaxAEiZhkQ^?`?%v\- 2`<0YBW@gP<8L%La݌;<v ɚ?ƋR`-G8>ּFH^V_ꑥPퟆ^ B.=u7!܌G%^pwy_hqލ'[$܂rV]NϞ/2ho#Cn'_R[$2WF5:@QF;8!f3/vUonFcn۴]ZyrUT.wKj䎦R7 }g~nUߺ7tW%EHwM +py>\Qw yaP+t{6'<8 uœavL-Z'vжn]e}u M&Ed<v/~Z.{"6C r<w!2=x`)jC|0vw7ڱDN(ymO۫}'>fC.ezҥZkΚRA9m~)K' %5ٴXf }.K; ڟ߽gnG~LCqEwOÜ wGťl뺫9?zkn{;]m4Գ@Tpإwےw:>o=Z CD0,B=eN/y>,64[+zi^>v%g]+?^_\fMm{sw?M;p?uf} 8$:Eh5}O]?\\[^3e,}tX2rE/**' Bʛ!75AdD@~[U./PNIQ& WFkt;dpb"Ld-J:gqP(ίZ'!B#z֗B Vt\oK~soPu\Tގ~1;cK8BxhHMM73aTPH㰅4Fg. 2j83e6D_P-716q *A$'%-ШW? 7O5<|{{fx3l^Yy+@$2Ìc ='l@TnoߦVĝ/޲C6ja9 ] ͍K="J<vvMZkZS)1HiL)0YɄרj&` F,kEvT~W10B*i-pT 0j t1i MWx_n_ȪL:yi21^9Kr*QӢ[']{ '\V61MpRl:[+t^?oUf`4(eIyÒWAdY`A)eHE*_JFmM1y#mm̒ = ˙x "mMk9WK\KmX;%zrYXmeTʲPprETPw βǯnvuWOðgm9Ĥk"syde٥DLFsB3e,:&ԜB U%2C9@ElJmPCpd2uYKtƔ[YbWfƣ`\Ԯ:=0ؓ)b΋,xv8 L0 -ɴdf-HI6`BPUf2=8L̐Y(!YC-8|YEdT'kolWfy8"E"V[/PY" X%ߜid& рErF&HF#%G ͶvHvH;iu%Zcq[uQnvd㉆@iP?Swmm4288ix'$ v'Lf0/:eɫ.bnv$Y۶lw8"WbJc"&5mc<7I-} ,%|)i+vwT'!GR!}ЮlIvWsesX_[LS LDTQ%d$ T JL0OB˔JqN erRGƢqQcRE#7('V*F߶K5rv@z0W d+B++JN#X [(?N[BLQF9*WMQ) e"b栓Qi)]d` kYZ#gG9O@厷0*>Y-Z7* $Q䋩BD9RSU6ESAO?0KĠH#UIA%ΕU0:"0V{POI =,>""?Bs1 Lڦ?JR<9LkΌG+š{\d>e߭ xE[1Em((\(@s:.&BH%B줭=62g}07%ZcCă'expk# $7V{o8FK[9։G{cu5W:TҡtZplRMyCm8.R2HM(DIzY/%E%y4N?mD0'2 JƘ+5462')xʛ.sѓwz%ZcHr6LkBA#,O6zʓ6j7/1|`r~({S* 7yw>3~m7t=ۅwﺽvK"VY@h{׹3]!קvi(uez{NmnGغuزĖ{ѻwmyyγ@SpWZn\`s}9;e"oxZw2-kΧ?o1†|_^msa=5j;M.E?-p!ß/Lv Q+R;dzɳjQ.j$0,RQrVwWYWЋ^^1F>梭-C|NBgMmq|frϖqԺao䋦/u5Z4;xX5ʷo>~(6ΨH l{=lr9*t b#Ĉ"Ԓ]Xfߊ0~jj_MƗ<~ 3.@沒 _[ʄ*>QTe *\Yjl:tB 50j*ծpW{y{z<6 "h 0hE@&E\qA)̷y&4J# 3j(C(B ʐ 51o燔F>̂ JveHpwhʑК՚ceL+ǢP1K}H@eB{CH:~PB#E=dhDF`sP4'm4B!J4VQ7[A܏cPal|h0#S.5QȄM>+UB%."`Z#h+hѱu9ִS3I9w#Gn@qCI!$H&hC}RJB]A' SLp2 J0~n3Y$$HlCa{*e]8okZ]XW-N6β:H$_WrVuѩQS|F4WL-9q.!y`箧U,FJbԜXÒ#j 5Tw韯mOߛO_>I9&5 ގ۬wCϻMi[Cx -ڜlw״9 DPPM?烘jG6>~'>%~, y1ЏHBHHY.crrQ pEr@sJ2)%30.Xc<yFx$bdMBPiM1$( Qc2H!FA@s)h@I&Akl̍n]A"kKE; S Q8)H%C I%3ۋtr8⹥ !y\G'H"ڃ专$=GI 0bc} !ڹ^wdjGB!R'(>{,EhZ'\DLJJ&CcI3/}R\uQ^)c\-D(9"ƉL hYqVeXqlb:I^>ΠY@t\>@խ\2sDwd|1h# kD#-q Q^kjE4[%-yZyFjWbYغU3UmjjmJ;H@J kVMl rńmdy#2 f-#3C͸*/ 
]VWbPKlyR ϩȊQZ1ٍnņ)H!൒5tyECR-%}X[җWeNzߩg2]{l݀G1b;u$z&XG=kcɃԊATR?<fL6F]Q0챨L-CWWJ֕mx0L>"u G]er|,*S{{UIЩgKceł]=t`B~.]f\Βs7+om.r4/Z C񿐋ۛx4i\M;$dH+ sĔũ&o?o1wb~q6VA)ʆWJ-sap&O@}71.):c0 8Vcۼ,6;o1'*>-޾z!RZXRFS= gD'Nź+|8` @r u(#SIe2ʸ)q氾Ak\^A1j]"Z6h\l:_U:>#];#aheV/8d4^=U+ f,NQ@^Xߟ\n0f0u&c;?t&/7aθ>٩hfSدȊ7.o0{andZ#TEɒ sucY6a2*&GC^S2[)y沪#D8-e "+HHgd"qL/E?{׶ǑdCIy]{w``k헙5y5{Dvky̿ωIu5[dyE-:;*22✬<ɤ`[^A0kO*\3awal/Z=Mo?nTkucIN$QvWig9&UA;]26F%6K?)5f/B"Ug}ʹR7VB:(OUi쇅kWj5uQT*Xv6E>KkZ-xd5jA ZZj 9ZE/\ :#JcTx}WzNQhYQkڠcK,ϯTcԻf*xm0YZl&&MjB1&Zj;"3:nJ9h춸\T5i)KNt.<a=-&^_M:ȲzY !s6lXLBM":JJU0#Gmⅎƨ:ރ%vwxfg@+ZAC"GpA݄"&t]?K~d|HEbs bFgdɘu!gkޯH)s9jQU-wTM%s$+z[t!(sRB#LV̇%J]0bL25mRhі8: BOR:CzijivJceOI=i],>YCD "f+{XMI#cfC ꨱԳRȮPB@&A եfQ#2L3RQ/3h 2Q[ݫ N6` 8 u|ԁn#Xe~0@&ܪCZ]%WB['K75!+<[dX %d4k FCFzz?s%n`;Qc(JƬC5, PVPB ׎]S@P&F#LI2f}+~Jc;[-V-tC*D4XYi)̨HG1"]Mh$Wid)d{:@=--V !ptBj3xWh%Wk1h a6 )B>XY70vD )_J&:&|K@Z&r QtGdUmj(pgBQ!]`(ƍ VPfRᑼ# 0|I[mHPSAHvɮ\F)^eFT'E]%I/[.0gT0[QЯz $kIk (DP(46j"b $C aUhޣƻ ~,B\FФq 0lnH3fD' G1!E B /ڄ{0b8þ֙{M>m9#+J$W{H Tb0P <9?n _fRcA39zjN(dUʫ F&deZCP>6|Oy ᾬ kItK, o 7|tX:VzR$ 8ՀQh%wV4<¶MR xY+A";>n'1g^Ls@ Q/сwK }CGzv@ @z R{XRr*a3m>%0ZuIBv ԁIֆZB ;f!3DQӦϽe;VA;(%!ͱk?n,BL$&شzX0&a!dE De#@ [҉6k MJ!Գpg#Lk4ZI0xnAmJMKKުh{9ExGoR Q$4y5*a* bV-6mzvF~]IkZ:^ms#f @BnðnNN"i`$ti#ll% f ;4$J(u%f7>\JY @ )Y J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@zJ cs  y6J X[C9VZJBjb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V`%**@ Cx>J !“WJY @Ζ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X Jr4O@ត}6J Xt%+^V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+^߿ޫO^4mwqzAܥzZwRG@Kt< `mO^DK/At.3(u2Oɿ|]'E}Eݫ@\=l(> % oo0޾4 t5ܠ<7tޏpfs?WMW WjX/ .iOx%j38DĀpPb2^4Y{OѼ hp~6Viu0Toex9rZ`Y!iM }6]w|-`yY N2,됔rG ^4@_IN#RH}JՈMRfxhW=^SE~i7ţ\ b3$7?$a.ם˝p:ʁӷXzKK}.|#N>._s-Gd4OUMfp>x&v = qz1.e0ZVuY4A 3J_}]ǸQ0ޠe;۹?Rgu.r;Yݑ>C<|_Zewx=in8~*4;}hpU\]ga ^w͗c]mس} ;X] stum͇te+f;t"|K<,,p:}(wvv+=?[9zr}q9{y٪uj^1³ߎG;ÅO=ҩbIMo:77>ɑmK22՚^j1 +9RHfS{ܶa;hg9~Dy8К>w>lyڙٸa;6h!S_CV/siJTz:GUF73$@"r}tw$o7ŦZ`_xjSl^O=Ȗީ>,F4%} bpYdy/&U%}}:mSK6VZխ$]MƄ~s'3ݷ&_>w:,d 
)M촼UUVOoz3mmuSFV "EJ/7&KꖺiC\ОIJJJR #w;\{'4?ػ6$W,0%y0^wϸӍF{,򈔸H I-oda^R".eV|/"㨂g<ddpBŒ4.4|$BG#| Ei'([g5=UY숳W_ uti.H{CE %-UL$X£Vb`IE`̚䡃@M\}/.w{^$Ȉd4(B(yrDa<0F9~K6,2b`$QzT+NIK>a؟{uǒ~XCXR ϥt kJ-1o,H;QEpLpzyH+}U`;lUVKWmxM{΀w٦-1 nhnQIfD8HQd DYDoc'PWmvP z4bNJrɠ뼍<,tQK2t{=tT[!EG4^N`W+Ҵ,Y?t\K$!ۏo%eX$ ?Alh(u:5L`!gr7.%̼Ӆ=4KgУQ=[ eu)uə] <9ό!-S˒̉hLNx'Kyܝ@:#ІȆE[ah-kZ2;oh}q|EGl_.k7ǎ {;ޓŴ,I4/8n`0Tg(Q2Ct< MA4,Ϡ# :JJD(Qb(ك1}ݣuP$+I!UQR[ѤFĊZwSR(z%Z;o&2/v&4gL*%=՛`G}D}N}T;^WqNRb a"&n2dX%*tP u̘+EO|ڥ`15%5)urzLKjHug;[ =$< s9L5 "'G !,RZd4=!Je2{D. `]m%Kdm8c jΎټ R)HƓrdt)#D(#J@xBXZނ@]z;Xփ0;}@91r4ILE^T 4+&dra068q?p`Vn0u=kJ2sJ]3_ jdHECXjQDkWx՝~Eefk oaqỷo%=Y0n) o2A'ib]zj{B*v!-^WsAEugŕWH0:Reɬ%Ά,X.ŸRH:wzXuޝZBr6or:S lqCX˝ư $A&HTFELFΤ=FpYICa9´Ṗ笢# !=5ZT5y}+g[cvOxݻih_ۛnyxz~q'SMux:]nOj`\1-ˁ񟗴OUF&|bbfW޷^NMVtFQw9y?=ͬt73{H?_N[wS;`l]K ֶjuK1bm3lᨕO4e.>=iޟnqk[lsAkuX f%MA.RV|2 FW1?cCݬR c|3$!=__›I/+ze6 UJ^_HD"w???paOǻ>'z/Sj]xt~{\3_5w?޴MK 4jz>ỴjyM.7<"Ċ f#A{l?'ݨ=z/XI"H4׎Y=s M01rH2&Q2esv ֆt9j p]R]SIȖgڃϘ2PaH, qйcwSP砪T[Iu|q7^U~`K+2'3ŒWaSyxU:;7ܵ/|Uͫ{Rͫ_~2T食|4~TdO_Fuex[īa^:wsSE3$ I8k7;Y:W՟>1AC-= [IHV!t~]?ζ#I^N˟qvF̱ siw3׽_f~nݽ_ϯ:< ,^jLߙ^}M S?@{1)jTKPQ:>j]6xt/q4!+?m_*_bV 2m ] RFs& N.֧ZJI+g"A^831lbP|ttsHWGnpԕ,$XiP8R@I>M 0E)c퀖jٮՑL+FjŧTG,cݱ8jتK<ۮV?ϟ;Tf">}QBevԝȆE9+o) h&N(4JU;W67Ѵ5Uo߲ ]vc elu3#戸s|4lU%W4(oP)SvN͠fp m*KZ0FRqW #`VJ' mA,fmHLI:s%Q&#(ugǣq!%Ce#i>z`*\"W҇15rz6;|V7=> .ڒL$aUAx0gIy䂠=qJ"iW9ƒ=9=Jzm6BF$"]Zw#x%p<yOڋWdñ,7f'͒km}r@g4>?hN'oҤ ѴO58mFӭNܖ?\A$|[vrue 1Ӫ챙Q6bfys#jI߷rpH8ᔜ\}&-SA 7^oo `+["p* u6|91دa<dvLq'3#E7#Fc"}#b%ɄQiSUE`Ɖ5"*[I+^~KE!*cjQ<i qW}[-PԫyߥVRUn8_dk f\ o}ږ'+hXktRW;$0:Q/>H)y)1^JealU` |0WښRYi)K.H`* "yL6gKX+ůd֝jnd)ة2* eeY(;Yp٣2e_ ]lmt'ד8?Y=?>w.-L&1g,RHhDtNhl%'HB$FWJ+34'dB"66QCАpdRuYCp:G,[;|la4LCVG';"!@yώc$ %5zRm3)d Rz #p*3)>8L̐iQ!YE-Pب8bI|HY%IFklWvy* ?"E"V[PY"B';XCX9HmE ɀE%F\Fz!~aۋ6wHvH{{P,W?kr=y+vJGuQȶud{r$qH662J,wӴHBzƚW$\ :wcO?&Lr@7{%88h"L0%ԸORoH7)?ie~u|dO8}h¼X>~?tqXO6vkhA+>ۿWRKӋS.0dY7'KC )|%lx5!v+H[Zt3ʬ]m ֏wPԙgaٞ99ߓ.`gBP7$uw- ԍ{V8LxKVl&nYC㋖vyf 5:Zhuq@2/j68Ox+mT-g0, `y.yL҄uYrFč)S4'0jqjI\&No\m꽮l=ߝs)ST 
+#jߪsRٷMnRh>?AeOj,smuFM ry8Wy]`N OLcxE$K.=&U[ɒV9ܕ[E<'ț*RCTPbbC02WvbsH6R`N>.yA~{w刎ٮ=طvwpEU =w;ݺ3W5 ll˓\U7NGxi{s[Ui WQ&,t; [JCVzuJ]=oŪt ;LR'c$(<ҸYLg RU&Wyƍ;+3}ڲm+.}U^vJOjIP)ƣ9+3v i[Gݴ^}~<<޲'Xh%uw$?g1M+5[moxL2{+^s[ud,]3֝NjpWopWopWopWopzY:ܼx4߱Y+<6d'\0&JÙz oIģ|ѫm_d_di>&PɈ,+!6E $:-2C̐DX%tQ//xRO۳xsDxw]*`RaXL:8!EZ&QOQV^jZAxo} K4m%ov} |{g^z;7ovwGsȫer=mn%huC힞/Kj7Gn)-`wG'q:-ha<(.Uo~7L+RK Azy:?W~Qh;^Jp'ӛHyiS! ;ˆLMȑ75PE;VV#kx/jK@h.20^S":9%% FG'&C/'9wjaoi̢xl =C&<~懴8C:-ᆕ\)Y9dYKM"0%$)cUY*sc),297P\9"%2n:HrbT__9N&/ӽNg1m{\1l]f#)z , *Rf:ƐjTdύ7!ܗ]Y=Q,7TKxG&W7e6mNN?ܿ}7؎+LZ,0oYH, "y/ .; KIbRU-̊1 )ڔ&Q`)cg<*or̒u5v5rk8Zp];=xW1GAd'o+(f2nHۮ2SRt+3䊢$ђd.rIGх!k`jv5r6_X?NՈFTF4.:2Y !l,$bA8r|LrXH8\nw1cl,`EDɑxVK"i Tn[jО^g5.9U/zQzqЋVY+>D5OELw02*(կ3QoX^d} hp߷u6;\nC%)tfM r3 mQ?/o h7&=H 9 ̍z-rusa⍅ichxa҇Zf}DeRkq?4DV5t\nSЍfVP烖듖4Kruݎcl8"mwV@H+:{ 3#.;3 X?AQgNFg{|>|Od] A3ԥ-sx,P7j?{IZMFc0f,Y)Eg1ˎ/Z )杺Қ.DNTk^a$f9[封re^l8q8V87eY<&aX0YAZ (]6E8'  1ئ*MShR(S7N- e-+MU͹s7"{ujT{ϝwW)E{R+@Jj;TҘR;^=^za/7)I0BQy]kt7 hho3eUvԹLϋtLL/sR}b n_6,"W^u_\;XVn%KV۸ 1sf߼ ekoJBEYDWuPi% %Q5K0]@Ov|Q}#\7.㾝DX;ӧԖ$I*I fZR䤵Z 9=U)k)|Tua6gR29Rb mIVkɈr/y xƉ;U,YyL s!ZgLDɂ9Y+WbLD d0ě,K+0KoT3hD@]NYx!+ȘPyO w=}a7"X@̘EPĈPؠ`RV3% !#,XrWC4I+w,#M?J/Q%/\X.Ric$T@)O⨢8z['Nʍ}|r{ڠ#o 0dD.JSZkY VoGh{bS1EM3 h-ӃCLyÓҒ4$y=-YRA<s59U:le&GАgsOVǥ[sOZGd,d %xe5rZRJrj{-G[iZoI73!pndlIw7D-Λs2Rc5RبM IEPݱX!l!t4Nj){NY<5 !H8OLEpyd<,#Z%HH*eg-ţۦl m\ڗ"UK<>@kR"JA_4|ќ3)u6a4ziZV g7cm~n>~?|o Tб&#uY=tc~j?suXZxo[ `]=20F΁Ym fy_Ϗ,[v?vfvԝ>|z(OD$mP| @dm,FWg|XݭU'ݽk>j5zz/g^Ɏw?a'O{~՞S8\C=ІgmwU@VlqmJ bs'Tg;+0Lo6/4El>5Zk6hT*7>8G;MDhVN!t?$Du1)_OV?>MYG&b"<3b A .Ⱦe1(V*N Е8-62/6roVx"YrHK kE$ Y9uAQ22Ġc4Bg^3 w691+ZԔ:eۍ'!9ͳy5VbX0wCɭ[PEY="4^QLQ `,% `c``IxR)1  ?=~?tEɑUz ^ETdyPdO&ךQr1H]H*C/iQFv) 5b*-j)2CbLvEjQ;mW'^R}۩Y*5lTz0ggqj P左k*Eul6s0Oa]~.yD5vhMq  ,|\55;x'CJ ;{Zr{S6{eg%+J>ʐ*+6 jqTz)n>q;JsaCetZŗ|X/K?8r>8 ih_ۏ.p./5TU0{>7q{۱Ix( 7L0ң݃5ȿ>7 8|=j1Dxi{ OGb]Kk[\ffZU6ѸO =hYU%nouu^̤iꗅԆƓT}6-1 ؗ{r0W=Ōh]4Cc0Q~{4Mc|Jk’j g;5ȋ xƓQh:HGRkTϺx4}ԍҎ (b'<_~WwCV|t՗!r; 
[Binary data: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a Zuul CI output archive. The compressed contents are not recoverable as text.]
•f & hરˏfSp+tp)a7W!0 aR=qkYኸ p&r8"* sWpUGW[uUu&J/sLp%Q#+bWMnVqb pA/{+8ﭳ)-mJ?~?;?/#&$&XYf7]ˠӚ뗟65ɝשJ~W7w82vq7qzSvjUu^b٣wDPI/H\bVW^KSpv_g9ўQ9y+yY25= R5GVi|ɕyА5:wR#:/gY{ӷ%Aϔ$Fq;I'gSZORO0w:N &cq]DžnX_[s*aMҩA,/LQ6yCz80 fKQ+m=5Qualis]FKah;Σds9Rh1ѦdC.߽:1j+3*fu@d urWjDhY{nb>8 91ZsUWUiԒϯ==MK<|76qʺM(KA4EۜIYg,4D2R r@>R;&YE61b5]8gOUa}pxĨt # A u{$igۧbjSLG:SAd*|,ťC4 {>7'!hΪe8oZ@sԬf[ ރF%k :rNqN>I{u3reߠg1|1!<6$ VRj() AF5C b-uߺTs`Ok;k]Vj!G]1Ψ|W!'c#ZZ!ss`J"s?\:ZaPB Bт:?聺0ir@,x&THc?6Ɂ)BAmQx4&{4:BFh^viɸl5 "ˮOE/P"Hb1:cmP6Z"גuV6`ťwrV1-83ƳR++*?&* (JƬG5g~P6Pc@n7((Co8+MS`q\baNHuPvAjʝS3)M$xIӜ`د`ΫmO J+Y_ts͐jPob~w(c)hd Dck!$ ( "*JnxUuASзbt,r@(M&jZs">b1w W4 Ѽ<{:%+&CFPVH܁m CKwEWfUSUߨFbyG;ێnЗ$ `M0Ovߟn oT "d*S8r+ +tme,#z ԥ$>mŬ}(T*w}T 0A} $$LCS{X1z*Aⶦ3:Z]lJvl ԁ>%xjhjw cͪ)f^ZFq/DhD td++k6LE1YIbeUU|2$?< B#zw7dzތ,:ga1=TVh Aw8+N flNԬh" w|t֑pv4Yt j@gV3 o= Ե&^zk&1 29ԿYIؔM@-/k W LވK@Se,ol+7t<i^]. zL2ASH7dllfѳeYB(u5u kZsLڌќg5rFh bL PN2Frz6(2#lRӑwCRDHK-PUv#!C{w9v֤DVT*qhuAC0TP g':<<,뛷q3LƾXA|?uŠxZ96C $\C~GBɟ"o Q0 bpt 1U +(nFRu#x$MWus[`pO*6yLG[c@s@@୭ܢܴRHB5kՃ*H]6 |t^3Agg &S}TZ@׶䠧G[:?58g_% oԈ@>b1dMAi "Y5MD; Zv #WfBj~Z)A('{|tg= F66ЉCXm(u qL:5󛁷 J`H~֘-jUeҘ" kEЬ䝨*>Cl.Xvk5#[ U\*ygwb!h\>J+eH0H5zߜ^ܲ\^=*D좗@ Zy+ښFhN߽}F!=-2HwAAlyG>YB|O| \pHnQJtyo8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@ -o}֓@>AᦃqJ'zN u'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qy@>('{w8N `@@ԋw1$NJF'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q:2Vr1`o W h_(^̊8v[qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8^[Q35P0?y{ηYo7oo w6L (:q }8x0%%xJ;1.iKDՃ;CLWksׂ)zʨ^]Uzv`?b>tb! 
]B2Ƣ}= bt8QW6K+F++芯>7QWoh/R+i_%]YMf2D3?6|0b뽕* OWە0H ebyhharvHNlWW;ZgwWi.O'~,[%/K#yYZQ ~f2xjNOi]_VL}@탓z@#6Ufˤ)hks_ۅm6ISs__z÷4Z[.'HoyGƹ)a0]a,k_<S"齦N"DԴojt-e1L7 gU[@nQc4z] UNVT`_Oknkz즍Z 骄mdbSv@t#I ҕ⻕BQc$8Q:6AZ/?]/;iP1+٫+ܡMP:儮^!]y)3{{8GC+FמNW2ZHW^[OΖ?ނ秫{;z/Vxs o3Woxj/|;=4c3e!I1(&?۹Ə'Ĺ7|vāl3q{ O{A~i#\$͕6ӌw/3q*yi.ު|}XVugZx <{“Ū4"N֕tYpyq:GU>{WkVOhݻqf6b[oo럚ۃa ߿~kyV:m7V0Eӡ!_Zb\|˾Y__|l屽ٌ쁛U۬2q5:`ow 78r>;FNz>Ϗe7]SUԠŭ ӿ[{WYw_.'yR]O7Җ@1hރz(?.èSq=WR:slLqϷ>3 :6&[ر>;D@l_Zmjm?WP-_{y_̿.Ϯw·P@7zc$u߬QnH\h{)%[ZML˭#+C=]9:hC:*w6RxOXc3~ɹ |BMٓ{)1CF[ʼӭJ ( GVDMIEEyWiDj]jAp43ǀfe.BfDo.Tu6J"͛Hk/ޝxwmK(yHA$=ax̰te?{Frl ?\ 7FG@>ll,rA5~T[Hlk>DIZe+˜tLs%0*Bו־ >amF2L ǏEϷi1.|?IJQd6)OtM-gB.QX58NK3o'zqPoAXq,J \Gianq 2JjP/(K&N.ɇxg3<ٜ GY[m5Cp%ϯ2ۻ[ e2p<_wgO^AN?·0ҞѾ?^Coޜ/N7?j.9f'q'LKp76\%j5v5/b)☑^s, =.acPJr0esv Fzng \z~?& k.E^;qBL }ߋ6mxqwV\tR]\IɊ-f%KnT薌yedt.دTYә÷6~g6&n؞^ӷ&Nt=fega7qdqtvCB{C}({-(paupJJIM=iNQ$4Lع؎n$siW:L(r!Ϣ6{bd$wךU0 ,R< zAow)Ho)c)H%.,3R"G@Kg1(*Rv=SGўUs6Xxg(='BT$1I HYd0!,ƎƦX ӻ_o Z{;zw il$|D#TSf8a61M8=%|n?ڽ==M{~=N?Tp5 >gq;-ÛoD^Y]hy|!%ndWmv B ')D^ g<ˀsI3d"͗NNn l$x ȥٶ¸v~F*8Jp254|+g`p[ޞ^sq<;H*c\AwTHCg[nf"}x*XDt`4]φY)qnz@mŋ S[іH5(N_aMY[PB>YuVܠtS ؖ`K<#^)։1SرĞhƾwgo6*sT H1 EJ-ntDr MR).T>J1T0me(3MQ* yZhdugG[A.jwDm\]Mz6F a wGO1_bW*5/uXדItWԼ[sJ/e_h[ui>qGSs1cl d4LǟX`7eC 9fQzYR]DyD Qeg2@*`Sb ($rSɫlDNmH*@%n@e, }8 Bg.Y&#g*j3e|lhruDjY>b?aWrJstgm\vI[毛"O:R{7!$^eoufyoD1K B dVfm#j@:3K<"m`> I  S"{>vzUn#^eݱEWn:68%N;8iGЁMt^~δ%t,_m?һ^զ(V;@^l2Up\lQPU GP>Vy @d *&A د#Bs`XMO0+CCmlќ CSgxax[k8G}gJLd 'r"ڒ(3&ɷ{+)سEnu«#LN+\k+{>ϟۉ7W'ah!{u[!EtJFAb1GcI6Ĩ$7 7*ƻdo__M+0:훧<w; {c~7'}KXg8yx>7*i)$\Q8Q4>ޠR,aMKǺiI٘Oܦ1 c*c*XDb2tAX_{r#KtiىlC.hq9?[W( 0X\%u%XcdYIM ³9KG.Ȕ@PYݢq i,Iܣoѩw9 Fa3zn*jIdsn ʳ;N&ܕk}W0M*{tE,_FYZnrDO5>-5q<\. Ҥ 4O58MF-T="Ai# KhyBt Fe8eB1^ RUA-JG G*\cb e 4 q,˛)H2'3qE +jSP MrB A@'˴c;iČHV\Lx6U 'Jl巄^DRIkR0XPug⇒'-ic擾xZ1jbr7Y=¡5O| ZonU\̝6"Xo_rylǕl2[ л_˞ƪۣ`2ĤRJ^W& N)&:`ښR"ڔ%=mʤ l sX;{TWU֬52BY eO{_fP8غvO]rd_hy4ͿrĶ,2iĜ!aIRR"9j*"QR3@V2"|L ()!i4$#\2. 
pN爙Eck^՝݈&!1{*s ȂgGl  %=IL6lAJO1xI@ !RQ!IȢ(lD RVEh|5jٍ_`<D6>ED#bxo?D2Yr5Hڔ+ۤY>y ( *d9%Ye{΄${\| \<<6:CS\h~[}~l*_sdFzG?P#(%Uc(6C鮓VsuJ/%1|,X\)ymC5 (b&D@>Fbőhc$p-a?1&(Q$.HQ8S$iVI%)N ]bs;)S+ hf4/V4_<ۓ#S) wxI(Hd1 < 住,"ڠyOe,/q =00pFmqrpѐ1h/{|RFqx=@NG 1-ǟfnSM>ud.AgeZfkCsm Lkgepsd`5y ˌ-·̀ =t~z鄞y6%e4^EcZ3($@JAyj0*íJ*=FWoF(,h.ȶYT)4Ih̒GEBHE4)%M+[}5vы#1XoRݗ  ԃHQme7 ;H'NVK5@򤻗l?+q`NrBy(8"&D(OCR I1A1LZz8IH eDHk*o!HD(&C#\w$gf|Ѷ#?JVar:|2*)5ƛ1-HhcZrpif?{ƍ)}yP.eSKmoIuH-Iq<(J9 u[SVQV=ɵ]5nǛ{;Oz<Լ޷yեy)5fqk>H>ԇ҇ VIz?Շ*?>BwUk..*` jojHUC crlKGAV=[)D6~Bp66mR=$ݚ$ +6KXv zBlAϹ`? ŷ [*ңZ \ x "~||CB;$^/!͖Y z{dn=u EbG9H[Z_!Լxde Լw܎Z5n۴E&։Ln:x6X33}x FȾ^t-O$Tu?Yl6(0BpYDs^&@> 8V/$ &J!A8hz8Vϰs+M:` LiP$9yJ@ 6F#,^/J/sOW".n/7!9!ݿ\pj E[c҄J Dň+e zf>. KD(QFmLɃ BH+XuX/ ]@*%Q$*srt)R$&r!B+'|o~Z,5\jG"qo<шq` r--(X;X_t vL{\ܶPmm<돷!Gᾀ[b spBթ ZvT U5* }Ȓ:ӡs1KKNfz<.Šҧ|ۋͶD_k& ^R3QUD @;0AJA`@)"~rK$ \&'ud,3RE#7('C]7r:T3D:8&{HpNH؏L/qOٝQk_ũ}I.*@!wm<%l>C˧ʛ⪡.g@MHM<"&i-Ȑ4X%Qr{^A 6H(g$Ƞ5TxA)s=hTzEmxnKxKp%DNJe2* dH4đ<29cTZJXz(g\'6` XAvV46/1.N*\Q4h'js!hOVg/,(z_Ro\ *ԹM>Re9T\ W^#  !zZ~9NZ C5}DD 1ǀ80*6LVǸEY]\sf8zQ E' E{><mAx7T5jDcVtLQ/% 6 :\pF ' !J֟Gi2>B/"|ЄxP@UƁ6x..Lrcg4Zj9Q<+TƳ7W:TҡD'"oN6WWխLK7ƞT@"G c ƙ=ؤ]pm8~L.ߕCWqY #9ɕIP2\ɭ!MS^ҏmzR,=Xϓ>{vƵ3Kױ .цiM6`"KsQqΘ OZ#Q!\d#|njYǘz b/ocq_&.yƎoX{y3;BJ:/I ~ien:02Oev˖úi݇il_ͽ<(L&oڊրYnmɻq -@jM+/>_pSe J4S%%Gԑn>!|gdϢB)Xd2DG mH OB aysTdQ5T6F)ՠ&&~ s[ < a)nѳ7r:un=}mzTjRǕt{6:] kk֏ {7>F;ǸDh#P d ifu 8 828}Ӛyt|S?]Lmv4a^Ȅa!kP(.gB;URƗB\-}ɔ c% J-k;\kXs/d:;xdU僅&KR1kdOB@t9KR XDb wtkX-#EQ09KtFX1[Э3 0M*=XpA ˽%՚܂+u:q3ZpXpsYps[pYp ȒHڢb#.I'l4a)%Q^-8& 뙱1@eh'2$9'O9 оPFn gC:(h$BFӛhiGCvԞE0-+ZjϣVԞGfò3,5=Zj.=Z)+"ʓQW\bNE]ejTr=/P]QI$'d@ɨL.S2ƂLcYp&oCDesR}Y|fVo=B4bh .V;O_*ejDk? 
se >!53OFM#\SQH`tjjKTӚ1r/NwIZba~_yd=v2KM9NT(,M^#XdMIlRERZ8Γ”Wݳ'^w EordGm^|8W-9 nq7<'qw~םncq9Yp"E%e$܃)pLTP9u*J vB 6@Oer 9b/S@'pKU&؈QWB19WWJ>:Du%V~' m=vl+r ."fky*oh Kv +J#% &B&.Efd;;`yP>Xr`1 y}eȳn?(~պӦ~]FpV>`H& >1H4.01D)' IupM)hohfjjͻ)dZ_|֭^nϵ\~ss3C揰{WTT5Q~u3{WT?bP≆-]dLck#6kFsprQ)%׳1Mg|Tezno<7K|?YJEۨ϶ן6[ q@|>r*:'PGNf&SZ9xL#QiRKvc1Mʘ&eL2IӤ;}waw`Dcށ1w`;0|Ї i.3'-k桠d/ˀaEΖ JG(l=+.r)=;nu N)˽hٌOZ3Om{8Vxy4ZAH " L ".Qp8;)α[]Lh*"6FB)NgNϝV;TQHQJ!)g9kn`JѺȔ\n7*=Ŗjwp E{Ϊɶ6HpwhҚceL+ǢP1K}H@eBcaV:~P"#Gv Z7T)JSUy4'mh1B!JtPMA\y  ê/FabHi bM.|0 W%J\D$FV bda4rښR`<#PkqH?9: 7 rArd68`ч($Q Զs YN.R&P F!#m&0 #$\vFV=P!`gOV뫝390[ǯ.YyZ&3o'مv)9g[׏ UX@s,~⭝\wh<9ԹdC ww= M hCwe "{}9@v%@[8R1],⍛|<_,Qޜlj)gZB*y96J'R qޮM8]֡0!v VjzZOr///7wK)(H.Źvض;H[ l][s9v+|$Um8*Lf*5[\`1HoI]) (k\M;׏|N.a&۾\{ݓմۛ~\7W-A^bN×̦O6oV7{82/wjgyZPΤNi, Nsu"Ky)Ͼ~pVwzC=,jR'F>,}|U,ʤ*,6rGl3-.P?Ro'9yJ ,o|I'L1T7&t+~GE_?nYQ Կoq:\ ɟXE͟O'{oO|uN|TZ/᷇-!_<}ƖO}[>ܜWZ)zmu@.0:/}~}U{N~a;u,q/$_@@ H̉P$2xaU@\<. 3H*K)fqHOml8W.]/=\!*-;"g7˿t-h x1h4>9 4+v7V`6Z3˳Y*O,:>Ĩ]rf:X!% Ԡ/vqf`YmO~v;nDvt^!WqX.'u`kEm #oo'J)8^<ͬk-hw ѥ/Ԩw580oY7X}5op]l+lE(pVΣtTqW*ȷT4gUJvfՋE (i *#j[iϿ$K2e1Oy'O|{$?AkyŤ!)ufd3rYZX 'XIy,5 }Dr|Iڰj.D5`W-P!-zWm߂b>ĬOzHAruŴ;UGﲮC!a3⁖>[Sw6 Y֩[?i`H̺3eWjCةTRq/n #_nϗ:n $#*/{[%EG$C"뜰$HYqx\bd x>5rL߭ƢsQEʡ(^i0E7j(SY:& B1_n=[1/_-3}z%[U 7y)ɔV|!F%\sP"gVTV9RN$ x *Ws復K2H\IQ#KFV )0e$J%LIA Zl9um-paqS2~=(Wn6t |!}Kݫ-?1mi9KoWۏg2֜zpZ~8vSN矨E-;>=ST] xVFH.]s.ݣK#JsrIL|ejX% 34‚b^IuzFٞsapQfIbIM-衘&{A#Esg=Uin-g&|Ls+.k|ynTr Z\9h7@ dxwy\0s3NHh |f||m-Z1pAuQA`F ~6ڐe: $gY8T)6 ҙ:q8 7t/?zg}ySV|xO KdJ@Ah$)jީڃ)E6-zdiz1r ۝DR 3mۇ^tgV".9|wǧ&ĩ;O2](QvQ׎61~L?${thIQ-S-:I%% SH+9& |I}z g"}a;m򖷽k&枽B-'d:GOZ:DAŲɒAP.E2I`QNU'/er-j}!+\ ,cq \AZ (&BvhݻrvK8;b.Cj/Xkd 5)zfy31Kco/0|u:SR* StSW#A =:B鎈PچI N N}qQ XRЧJdN66琳t19SRE&1zuM; P+14͖'Se6#m-H4:.#DE@N/PzUsĒs6g&eI &M5kQki/ K@)cqFY ɔhmArNK¥Ƃl95u%Ҽt'g5k~l:*yx׾@;V7aV\ŝ亯VDr{mH@Xow_J ZyEIIgcP$TJ)YF6i+ND:ѡ:绉4ՌBTb5W8GEI`ՓST|fJf2Ҩ %PCBZkf٣h3Ү|q.Tuu‹g[x fs7J[/sCf3VXc;]jgIxIJ0b@B@WAb %CUXj[*ܩt6WTej& lan'/}ޔHֵ.j95v\ <lwZ{D/Bf3:EFcښS¤RuCI˶u6 C!32YtRٲ, . 
Ĥ:'l9aOC\b<lucG8jĻv|p_4$EaI&-8J|D#%Pa+ JMrN afn̓+鋱RPcFZ]J̤A[ԩh]yrȓ8UG֋n+\dlKՋX/Gx׎X3:Fi`|KձIĢ/7F4TCMadqԋE6CXh~vѦdj:gNkp0>b;n~|G  ^L0^.=Wk{0^vvs"͓uO>h tgƀ?݁PIڔ1Fbcᘓ~]FY^>nu'H6>w>&RI*I)[e|BZdJ R&h "#ɹ?) @|̦B4+ޏgNw_xm<-gOgY((JU2FKjcIP)9]4r^nQ19@ uJ!/'8A`2P:8!-~zzxU!5IiH>}nw|l v ^ZȧV<|֦(uCYhg28aC &Z!Y%UQrTQ Q|o'ԒQag B*1N}AjE. Q( ̞(GӪ4oYotRU73k/0h "}mbԎ=1I6g32”-X0J}34Ñ}/סŀ?Z v-^5 "J o ݪvs]j\ oBG) $i .Kk#֞HjjHdN06`"oTTXrrL!Y9^0@f;VΓt;?jV~z:/i^^֋jLHƴjr4H/N~>>W=4}]m>eOlh<5>iVg!ݜC3-9e̟ky0/}c . b8;v`塁6`oχFp~u o!xB44s[gpH{dJ{F_!ͼڽ/ɂw;Le:(qEDN*YIImfV?ޒ J{Ӫ{ַ- c/L\/\N qzSrϪwˮ\2 ?B0Ak NE39INw4)1 E uC^m= IτA>yu >|xr巃-xxv,/h]ۤ\]aNO/~Ӭ޻_fxn ?~ OZI0 t"yQdKP˥K7D"m;lHh7i ŸWQfIAwpr#wWIQLrJ͙"x47,VHӴQbd&#QHYuHzg՘IDk1h''kf*Iw ze*U$z:UFpkKN߸:{F_:{s,\ <&@(^8X+t^1z1E!|Vͳu1^-~Cx6a0|`H1^\2PY6)sW;<&zSMWZm^<[TV=1o{v=L!YGH(򹖜N)bxЄY\Ch .B{ Z^KfZs*\`WC1WV tsPj՚'h";ok`\\ءЦ]}J*fg{@?٥ XQ:lɲ\(Oz3nx`9A+Ed"'+ ncxr.E%`y8T2Uc-CbP}Y  /^a![~5?8._q`@x\(Cq4+wϟHN9W/Tsm~yxk݄J7:Y8ۍW6G1%_veaTn ,v[rtrtG2\t`^^BZaz@\-` WKhn:WK(y)r5)CZ&p&`CZJP`:hsz]1=R<ܯz ZbRfmh}sUm&]ZX=魺U塘V򦛫R\=EsE1mʅUF]ܛ~22I(+=+@KGU$betX,m! {C`//_b_ّxglSٝe*g(+0bj]%7ih 5(V8dncWnk.F1(7yV9)ABFIQ (,7A35EQ=#y ךLr 8M(rn֑aYqGkZ HjNPC-ZeT| K?i1kG_|7vެZzw ƻ[h=tgBΕe.Y8s~1udےbR9c6([8Sp7IHҐ%m_2jO*?JjB##2ԱETS9& f2 apѱigmpw,c0f%)UZ b}JL$P0F(,eZB7,t2gҌڊɶav=lV6# .ImR=QL%MfH#eB.Rtw;p`0 tZ9uEefti (%KLC@7R?2bVao)WRr.N!鑧\ph ^Yà/r&3:R ~-|/?wN`>p7GLgeWoUJIH:5}1S,Y}gp1]2է[W= 0[nh0?{Wƍ/*Cȇ$M_4)Ck|u%VN oڕe[+:M咳ܙ3̢˭@d$@&H48Akj . 
eLQt2:05db|عl54‚g"~tTL( y\(l9yWozv[^ɭFn{Qe^dG{l0 Ub"{WⴳקI|h d aPNބ9~EY{ˠOho>{/Ƨ|ޤg\$!NBHh<=]AU%U^AyULQHyŏ߿7G_2>i";\m?4cm Cs-ZՄo3i긴*YYOü ;,S-pY;(7^eMʺ Vl_6j#m$$AQe19t1ȡ.bɤ D~)=%C _ ` ً[m4I$ihKiRRJ)D ͱX F(1.nO="&wd73 dK/k'zp|K]vvv@R]o ~'Qխe| ,0'tZhTόP'I @n@kT:(3o*rL6C/|dn6?d]vA> ^$%/8Ο' QfgkrWv%R%noq|M9'߆w`=ʹQv1u~@ /~|x77ko~(Z7KX*YfUAy2!GaZ'"0 f$`E9徱AY0@ Qk"tD %YI[6Z#gT$&w5;HXIxth=px`"bلtנgs*(z]$ew$ !*BDjoԔ x-mƉ=/ !BCՒ((>qsA>w^".N4$ka1$l%\uo)cQmd]Kk.rKQCQSz .V'XR'c;$c84osԹȅʔb\+̹z4Z\Ac)ךf˵"R\7X1Ih}c*E+׃E[ٽHI}Y&GQJ4usr].qxGsKՂC;NL4 {X%4*ץSU|z+C8L+"vvK\̇=lj]L?zˉv"4hIe%Y-$Ĕ8G=e(2II<29Q1F.FzI.Rg:墄WܢP%OlۙmYWY_OYjY-=ZRq5#l]9Bom4vG?J ӂA '"+\Q\M}UFPߨA ;gP2(n͑", ^j፲rjv0jzi *=DB6F%Xcprh✡BD !gZ)PPWH1ղ:9[k!{tSvuwL?w9/N9 (C$5<S P>:8aVyZ.'/P2iK 2O}D@`\rUN@o|[lnZVG4Nn0eH|W':CxY+;8FyyԿ!k#Om'|ކl\ԬݐRUT"(/@Jp­G/P_@aR]ZEF!#ZZ\tH2B2tJBIT F@kJ{b!o y βfPc~tm, _횬yb#BGAh M*$ 7)"uҊèUjCk7<9g#9YMMcDf$6 QD<+EfvQX21w[; QaLN8edP4ElMۄFmG/h;-@ 5C*Њ<bM+ ǘzBx4#ض[#g3F}],+ bkq[D-#Cĵ5h\`$$A,6(R CuP^slHhR¡UD4@msj2θF{n Ϝo sklF?C\,p풯5.-.ʖqQv:[oh*.J>u276ʣc( R%D4;\\_W6},~ɗ#(3e$@n2<3RdmLh0Zl>IH9)!½ B劍;`!2 (T'5&9= X"#68*&$rJHjLpre(wR9KXZ #4\B|!(Sjx"3JLWy )&)0è@H;s#SpVyZA[?|nyU$%(M1O GB%֤~3/6r^KaZ4GCV6u#mn6 ߀Ȗ6@< pzx_I}0~ڔoжgSMPVE^QϼFڸ84 t4f8'œ^O{'|?̙QOOOh'h9=Ψf<Ǵ(Fy؎pB 9콉2{?ӽ:QvYMg^2eka8bqrfݳ׷^G)&_n3;!l^YC~Lvy @ +Cm.Xbu%1t{`vSGR.:ܑ4(Ǣͯ7W~:Fao^8l͖}.].!l._}~eSozAX)Ħ| bNZ":/5庈Q|n}5z1@K\?G+y{Jf^J6~V_?Q@G)!|:1&EJ ..$&$==? gueZh_0. 
TX/8׏%>{|^doOCLlaMޫ矖?Nݛۓs룊z돋A..gc?.nw_~Wcqpj')~OuwO F~.]ţ?~̊#X6/3,m{s2׳uYsP= ϯM70x!_t7e sǟ~}'K_s<H!esI} RNߡLO9ϟ/zXovRvRZWsE~2k?OgK5z^6XMmUb0_{f,ے񅄹[.nJ|DDeQ~;g<Hfb-ʟo.;6޴f:2AU{&:[\{kfpn ocb&z~=k|@ƼӮ~{h]mv;Hlqu1[dvHbC^JA51'$D)W%)$:~+Ŏʯ/ Xɾi4JLgz8_!n/7yHHX#M^p`ZJ&R/S=R,R9 sz}*ix0[+*V蒷)\ #On:rfxs{ݛ]ƾ=dǡ=y ;ą߫Clq(x_co(7͞6fnR`A V3t3Nf,2>V w7Lҍ^XRZ5o(2JkX/((Y7HǾp5FpQYpR|@瘸3ZSvU_Usv}f?Cy Zvк`кbz"j.&tlr6KȜgҜwfuAi ( ,Y#fBw[ғm}b= vѺ X&S*/Q T(psat/lUS/Zt*oq)*7|դj<ǣ*WG ߫hmfwkMS?ƗamC%injw3@hn`QrĸmIШ1}קW{C3s ..EzEehng ?@A=gQڤ~SE"HO8JfBAx+2!H%!.MsS}^ȍ/)" !V7P2I내|뛷awدo"ĉ!fG.(*I}6eLRFY n ;fwWWYWZHDHKWs5*M.xtyəa¢1%qSGm=m}*wrV Vv8SrϹphxMfPWBϘ#xLc46 PMbE: s 1zծEE(Ysv̩Kj'dM?j[-{兾kֱ֦PkL+jR1#40n66W4j W5"Ϣ敒a:lo' qy\זg[#&9qD4_^zVz7zʼn{/*eE\z$I"yR QI遲\ya#&%o!avr}#@Vd(&(Iˢ *S+aIVZSDu/AY\ pP}-y\Z3&Ygq5w*jM_ ki%mK%Ҍj.<P֩Ho$##ߋa=G#ZJBipQ-VsH_%CN,X\ j' SQ6FaL5{p|6N6^e1L&bn><Ѳ x5a Ut^آVe P2Th0D^M"ҚaugNF4ѳ,Doo!u:f:cJ@+b=%E3/'RIWny,bVGgW4ΪbMjD*š"˛ϵ<.zxpHlNdcp~p6Mgdp6M#l2dXgl28 &l28 &OMgdp6M68 &l28 &l\wߦd4kϾ7=%dȂ/Z`dV,e~pE}GuEѼ?({/bY Ck+rDRƍ^\+[@Yy/RNsǮ(&&֊(YO .*d[Ri/JZܘErvٹͱ39u֜.f8 O%ă7.tI)](Q,3BE|A!r)ѼUՍC°XX&Do>7YeAp|3|=zpz8Ә[&Ŭ=\*\m6E<63˦Ί[5Hխ޽MSF7gzo޵[6KlNTu@h{յK7{E͠T*{樍J3*f۫tY{[GJ2*ZiݽmUwMC+%t2R>"qΦ:[^{lz/a)/pTyIz7UKKBU6pqs cqǔs]7~&>©x"ڢH֗R4ĮO{Tr$7?.;B ƀr ].f EeHSFB27 O>Bi&o&ޏ[A[A3KW@egrQ  %d/QxNBr_rtwT(L>wo\{.AǧJ;PVBYځœhԒ´wu0ux1VaLs6ʹ^4߼_CE޼xKf-4?iΧjwb?Zw.a+GxsӺٌ)TЅ_Zs0l_l+}@V꾟H]=|rwSRh~ t2}}]di}~SpA#@a>8f]`рѪy 5ːHppϺHjgOܜbC[@|-eӭg4^vkq6EL3"RVyF89M W4.8ep5;D&S ,(/;#sA"ܐ<$% mxU `\H:^≫3Hٹ N!ZHD!)0u[/ Lbt<(s.i%D';D5qHt$A]q6#ɼk ƚxdK'BA+9?NU*(QU'e" t6?{WFnlRa$|5pU:ʾds+/rhĘ"rR%YI`s2Kƺ|Е1[򑭧" D\G+h(% gWr߽;ZXo 򘓤qc*o7>yDpEڝܸӛkMۻ>S׉egNoO'''kՓ9g?2.-9m͘͠,ټ'y)Y|>~x˫6'}7qkFMnku۽x5͛]/,5 2|<_4ϗ` 4>w]ԩSIzo?O:wEWH":aҦ6gt@_9g4_'˃D߫+V_d>|P&g'b$?c>Ӈ~6}ᗟ~y^Viۺ7woցoy_4??icjZڻhaȻ<Ǵ-vQS!$_G~O_> GޫnrHz\2B^&Pc-x&W \ZU#V(T5)SsRw `IVνR x Z `˯՗"DžFɕ9rj+oH6kLM)22l$v Eɵ ض+D>aEzBZ1]qfCR%&}uaw!I+\S:"+Qx8 {+2:it8!^z]LpAo_RRvRK"'.ʹP3h֡(qq x&\S^Cf **ѫ!-\O* V#pRSh*%Ҙrh/'Ƴ6ɁY|&- eb."P6U "mlllmku4_}Tʊn ;n8)ιcSM'uu%F) F[![Rtk5D!p 
]4ti?=+e*)P*JWXi&&vm}e!r@i`5]0emQjƴc ѱ-S(8֮ T1tY6-*9VR~L5Hz,ABR%Qt\aݓ-tٵ$/jq,%dXIcD1QSٵGlfC f6Ǯ ֺ'] nsg YsƳM@ٹcv2ٝ wB4@L45#x A!)u SȆh9߮{ Q g'-Jv[ ݚξZ%dzaݛ/,L$ѵmVdeUJ"!d-9UEd01Z؇5b/D"^Y&ZhZV H֘5-ֈթ "F|ɺ6Q)tcnqSh_;遀٧-g#ukmU75K/T[FZ5tۛ!KChw^X,8-@Nv9֋>n9.bwL6$UsE.CO*0Fցr>DͥB X(@hlP,96GrTŦEw K{1Lc7Fmqo ~n;m6Ϥ6ZxwmG2}%e lTK9^ep<:W5ZjXu+]ze8b\6.D.ޔbR!X28JJU)ǡAV W{rBW.ȗ|}ӗ^sX-@Bdg5%-*9@.FdhB.F&;nN^`lY~- ~`p,;o^IVTqp}sʉYsJt>AR?ʌ IvbS8t88MK㦥]ݴpIjI"oЪZj1rĥ81yFCUc[݌'/z|to.'tzt:Yȵg } *@Ws9RWAY`6Œs MRTE1和GV5lQ5p9E_[NպjB \F4"`[4t.ءɳހ/oKuyloksxOTM+{tŨ,_IY3(>ZMve6,.;5D* <չeB7;K%{w*(qe!Av# 8.-[O WOF 9VFGF T FT++Xk\ ccY$稝mymL:pE"d`yw; - юo)Q.9֘((hYFZ b|b9U&5FӪfk/T`w7]X!83PFT luaCmIZZ]RO[}@lhf2&W|-%7Rs͗C{re G%eSꖄ&ℯU6E]% ϕEF8g6=c6)0UIkp uՕ*jd< XhG,|V,Qyc[j{wejr_Хtdz UYtIEa"Dch+FCBCY l"6X9[KdX`zi#4E۩ٸX]\*aԋCua]ڃGxX9;'#&Fs6HL3:D$ҖE2rp&RnP. 6:80>q 2uZ~^i?G> ܋e}(Qя^C; ѐ <$N]gA :c3=z4  Qkb-1x k}rZ~jw >Ǯ~Rڥ'S]t찔x8_WlE #(@Cs "TYcXjHfWj^,f0k6ubs,(  'ūU9] J#MnO~TG9" l ʓmR|9yTv-)rv_fjKhi`ߪ4\[IE#_żLPr! 3rLHj<|:zBψy->_m\k?̽x?T;yܽɸ&/wJ&ȒVxR͇Lǖ,ʲEDL&8oJΉ.Y ir 7±ˁ IOR&|"Y|;n֯7qF ~yR7ZFL=N]-P?X| 6ߟ~Me{ӛϓKWwZ·GoE>l8wWbA'Nؼ R ~ΰ-.0˟_Y^sPͅ[xf?Zuߗc>ϯy.A/V")0b@%otJ!xO_ h7@:зo.%4~k4+D@DYR+ʼn1W|5ՠ+EIaUS߾ܩs}4%M'R{xμ.9 cW˜ot6@.Gdwa0Y\T_AGAQc-߭~&okQtefO~,MiZ| !ToӰڀro&-2湿 XwTegHFzĚ rݮ޷lX s?.6|L>Xh;iL5 }̭sG5/*O>8yPgIUj]TD)pV{ ;U]޴M7oQڇ٫ar7d~/E$Qx>f{ئ $Qe茵H TPY(ZY}mVߦ-;M[unՉiR%b .ŋN%#e&%^h4:! )`D=f)5xlzdbhR؄23.ƩLHY0g>h_pPr\{=8զZ_ "Bx8˹\j~^<Z~Pr7q[`"S2J} Y1 _-,ai<8xP;p]T CW yHW6WpCv/Gۏ>RΊC Uob9%ӗ/rem9DA@սUXjut6*)ͧAeMкOW ]!\|+D+L Q6؞F [¼+K+De QJҕ>o?BxCWWx "ZŻNWR~[6=!6!t .Ze\ڡ[tE[jצ‚]`7tp5Zw#+&yPgt)eBv%窧#+΅Sf-mp5nR܏aQm%U` J>$A~1IB ϣ 8jGJ?Uc?h؟?^<{|bS-ia 'ihaKg||CZ 8U,ͷQ&L; sRʽ1 .Wqn"J){MB)!#BByCWWZ_ v]QZҕRL2]!`τެ ZKNWJҕ7>YWsqоUt(Mo]#]ei 3 J ]!Z#NWҕՔI]!`3p9DArt۲ـh.V;4]l]C{߃:jRnkAW]+ݧ+,.Wt(iceBBWVt=]!]q.m#0s; 5z t}S-m>YX#\}in!4()iip*jg=]T~u %B}Q?vΗcզs t  k`;@sWPYW]vmzF#BzCWGeCi Qvd8' K ]!\E}+DU Q1WDuswTȔ(jHQ-ml-ޕk-|ː0>$ Z5rF-s xG0L{c #\n}1qۡTD1’A k ]!\#|+@+:]!JJ{:BRh]!`up fruBBtut)3'B <'B˸kGt te4ܧe\l7tp7 d}1ҕFzu? 
UB:>J;?]-^ #°U+Զ^lVhJ1gP+ծMOzDWX o lVPuBK/z:£sǺBBWڮTtutř8K8p_tpa+DLOWHW\S: zHP֚;ZȦ ZZL43-%R[h|U+7[lO;#}ZPo^K-"1s|aL*"\A|+D{͜PK?FR\QugRRo 2uBtuteܧl7tp-C}Qҕ]`#EWyc]!Z+D)c++$>9Jo b|+D:?w(vAKOWw+e2t 0燞jW8 ZsPڎ]t%{ڵ$R K ]!\%|+D{ڡ}R=]=]185+DiuOWGHWѝn CJCjj+ˌT0vArKɄ|x}4Uڰ44w W.*LJ_%AS44jUZdaooZ M+h ꧠ,EcXպ=a_D~JYH>Ԃޜ|} Gad9J.R)4 ]Pf4$Qꒄ1,%_J9'͡NWA"ZMB^O_a ?6/rGt]B/l^qEo,)giD)O: 5Ckx=_UZ>5}MʸR2U!5mJR2лUFBS$&QN` Kcae$%"c* i%3+;]Od~?DLv.rNɌbbCRIǖ (HZQ4bY bipP/]Z$j EޚM<\BT]Zp7`r2Aϋ[u[ tnWQ#ny0oW+ݳ~t҉BMx4Y'{3x6ʣx|8]}jůn~,Wl@KON牋T֌NlB.p„Y流V ӋUjـ*3L (]}}F Tg1.2u"S#V6ooNVMoymЈ+5/<7,qD{aPsI4 gN?E!!x/."oDKRj }X,f9jOA?p`\<$GAqޔGEsyg|hr^Un|JXtY&x()p xM54CAo.|W栗W_^qpg}9MD4⥵h!XN gϛRy%l>r9 Tdz4q}6&XS7`ZLS.x-LeY"\d&%*!&2b//oVݿ9]j)$9{`V8|?V /C8&Pՙa"L)fL&kE25an]/͂pP\ ݆M= }Iq3`sh= T+C+02UIbUJYD)~ w۽ྟR𣉥*51fKqsm4$˩b* G(+yMa+f,"Y5LYBA5N4axim bd9m'{g{e?+_?@7+u~BCm@H_ٻ8n$ ‴WH!dm bEkΣHw{zXgK3;@d_f*_O/Q_NhU "@*#Ts0 tg__;-舒@T2D d, ߲K#, Cju|I}VާO=+-=1"?r8Q0.l GmZM1.7OOi66Qiz<u7&:03]`b.dN;>M #໸i"h訐}KJ38FST`)׶2",:y+5廃z((BW1%L 1Z\eeSp{N=`lfP$:Fw%M{z]_5.ݸujABdKMͨzAF,0lvIkwިJ= ݷHodԢqrtӉV΅KOME54DdUqNJ߯L']9bwV=HobJĬ- IZvh, 2d 9؇@$Y3::L .5ͤ##p˒Ȧp+qfmMwin~]ұK^8ujAJ1@ J C*4S.0y 1"٨SBR.BX]h.gK0&L*H+.)KVIo] ,JP,"""H5imR QQUg2xZ¢|:f-ÀN% *ZCcV)Xomp 1{Ѧ$|#8H= MKs2we2sS/~JP nJ[G@<1w_ށ&Gٲ8w3[_Gfwo^I?3?߲fc=sSMEX9hK{+ӣɼYF/4mh G@LQjD&!c }.HdgD$eK0P\`HXl@ވHBgF 0w 4TZCHʢj}1S1H2JKEňBTǐ3qnDW.A"x$J7x3yrj*7`ы#@%tvkdz%#f.[C^ȦTl$2ٟ4e_rWKS&SK!c}wRR1dBQ|a)(98i{ d \лTR<<3VQ IYS=7ę+=Ce5֋\ۗࣇ[辺l|V>jr!ws2 cjfjOt1D$1lDdlt*']_Z\*ҠNZU2GP^QD\+#xIC+1t$Y=n>vٖbڶ~>65?_qVRRٖnkX$Ę9TJEB&M] S^ЮX<}Jhrൈ,s]mrlˊXG(ޤ2Ĺ]רkt:% jLY7n57>P{|ԍS.f/Zut6Fcž;-}캥U9PVm@E,17m\)"DR0e" =a&>J d;Aj(Z8( !g,Ja B:!'9jc; 5vrᔝνhU$)2GI> Xd:0jJ䲲%Jα+/u>v*ՐВB]A] 8o y6Y#("P5=*  ~)}1-"{n\E+EIDJ*Bxa>XE,\ +ÿ9z.>ixpd{n3d2^EPyH&;/}R&kԭ"=GY)eHY@$[F-EQfX2Xv$ui4^OȣHs3"9CX:Zg'! B.[gZZ;}gr}lc25cvMFaAXޯ ~Qgiԫ F Gy¼,y8Ll2jgV͞t ԔPRTR0q}ޛ7s |!խuF$zZm/}NB<CG<8 ih?_7ϏO.}k`tӿx1ҺM'}F7S.Z?o3oߟΏ\f,Zp\FGg/$GyKߺkG4~m-Mͨ͠šU6x'ڋՋOƣϫ9<oz9VW/MBH>46 <|< &ePƾܓ"C8-jϧ|?? 
g [YsKq°:yW &p6HGRWϏuᅚn.c{H|X@lu]_uoq⹻֥\iK) !1s'Vƥi矻pds )=8%";Y,(L(J:V>zSȐ,TWR!3;_Kץ<~vԂi휧YV7h!ON*MdkQ4fK yp-e[| y\LlNS1NEGi!T&L$r0u5V6' 5zdRL;pƂe%;}s;[W'0%Zv3)@F_G"YJ`$.kE&D4JXUp,*g9T@+ }ݢ/!H))*K8 tIغT 6KFLQl"-ztz5U@DUdmN5'Ut "WĘT9,!(Yb:ȣ=֫~nꘈj1!J%\fLA#fQqde(>\z!}^z^ Ӂ[Jvۨ3TXk1d uMQvbIΩ г|%Ǯ,x5:xX_T֢b!T4zaU6Rm RY7ªX;pa+BeWq4/@gUإiCUti_0{wZ+|g:L??N`+\_&e2;EO?ߋN^ww+=/;6,{ ~Yj~3N? ǫVjuvpҞ{la|_~,ؐN;|sw^LFL{yoqZԓlXw,}#>6:OG|zObpQ/,gb:0" ƪena 3snn0߃) sW+vZ(JY@>PTI N#Q P.:r4RG&1zi IUDΕ.U,CŨlS坉sM.QHA b |̢.eBFe!;auABXKZ`؎k^Ҳ;[LĀTsF9" ڕ8+*}Ū8i|ޕmJ`Oimm"+k<&mpwm|-Od(\Wn4F0 X8 U^mqVd$$bH-Yblº.R BVBǨ#?%{=rE5iaH][=dR|h($A$tw骞L\*tA1TBߎh0Q|L+tflPX1hY`crBa,K&JPSkt[E6ST҂sT4ʇ @*CFː = &D%L{cɹu!A ꨬm+~\"0bc-#Q2i}8'SL)E 4ِЋC;.-k4%D*1S<(RרQRţ=i& Ϝoz_cpV3⧳Ce\٘lˋ0/Bˋ-/%SB74 OpާL)9j Ŗw‡IǶ|-?4&Qa5 e5GU|?qQEF\`me=Xj86(Ҫ`D:%#jl*djw<ભdl;V%Nvܮ-T!yY\XQ\:CNJCh#j  ?DdVD%i|5)RXߩ9wwW<^ ݝ*b~ԥMnFaIySy -Op$ rH _IH9I2N!r--1İbz&:3BIxʀҞm$@GՄ*`44[/*! 40<1CM@*'g,CsT9 "YB z014g51tLW 0xϒ<_+e7-GC6V(jڳ;wⅹ7ِg/xE~4.L ,uyK=rw t6Cr@bL4MhZc4?t4+;7EDž>aLQG.]Su|2iM/+cY鲼n,R_>pStwŋp|,:?ewܕK]ap|8'dՋQ*Q`nՋ/F9֘N?y_C?i/dzM7^gr o(X/K{Cnˤ]JWnM.;;un?ⷫt_~,VD]zy5yלpP?mÛo⏋?^v|x9[B9S^wnnu'Nd!r?tS_^<FՉ8gF|aƇeG`TĨI#ȍ2~jt-^TLze=*ru%ep>Dvx en<Bm#1i!cp$5Ͽ]Ea~k~\:!?.٫^"-ZAN49}n>Egjwi}5g;P;jgxO˺Ϝfw- Uc=C7H,bSaxMXOy' m0=Eń#6fzf2&Vڵm׎Ϝ|YjxklNb"D@PcLD&0P)FT v ,Їv>`i4RLL+-Qx,(FysLˣ#^e C9뫃ޔll5m_ִA7mӻ7mroדڽ{ګ ծ `ؿyVech)\ژRJJidD(nih}U8/w-A?a#*sF+m0FLk^"W'duCQ] SK8y)? 
kRGHĔ)9[kv5MKgY*X$Yi0nAmҮs4a/(%&2 54["DNHh CLڋFrho%׷gq]wB.^B,v!^gO8'X-ίgeC??ާQf2,Š#jY~+<9_d_Νoftb.NDz8c>֞[SoO^lkvf ʳGˏ5`_Q캐ݫzhy>zxV'^ggNkg%sCn&;1KBLS-J84 )'72J1ȅрښ"uI{ST)K,r ipcpVkB;;~7LK6?O~=hUuBlֳܓg)9SϲNJ>wKGWbTa/xÓqnbxvu®8C1;J&,{3Ah7/ZE;J!qs*edpDRqBHL\_*ge{@O4袭-y_nK32s:E*5h`Jg$Iz IjK k:_ͰLB-}Ўlc89oI1\a%KzB M2ur q EY@wI 'PK A V)ÜuI0NQBj=+e rHi?űzYgS` ,PZVM B6VhcD eh1(q<B\.)P5pR9&Ů.6d߻;Kd)%R% %K2ВH$cP'Q8 q YZ!H@eaLIҌ6f/ wZt /<$%R%{@X!YdOb^aIiP(8Fu*lDڀ84AeԹC CPi Jj|āFnCJZ%oBIlOp1eКPD)D"V bĖ}iqxձ>7hm?[ 8r,i D)/Rys)SEKB`IP9q*m$I$ u굂{@XMa5ZSKj8(DH/{\ǍdvăttUe!1~N^I4cWf%݂+-DWWb*Ѥbѫqݛ.(h=SX*f0WGon,$/U*!zqPBy[Yֻ*Lb"ۣ4K#FjE}lҒƈ)NS.ԙ)Dο [Y<.VKM2&''cMkd- !0!ce.焱 ;x3S{û,y'|tacGK;] YR0K9`"qKv:*YT#,]+/xu*3l`>,>I]/6_@(FV<$QLi0Q!zJ02b)倬 /5W]eMxn}. cĢ:Yة׏Yߙ%72*A.doPF$}_2no߼?q0Oa TMKi2} _[{`#KAq.m,5k>^lZ@mĖ.W5-h*`[_xA]K Q~F,svY+m˱T]}+ƈw;PjbpĒyS2j :K%F̓/$L("(k@7emΦ酬iFJkh2~؃a:@P (ȩ ?gĢpiSR)cEwâU2b\N=5[u LψBi; 42kE] PJbYu3QED\ k ߦ%g  |m>y+u9FZ qejh.^>gp>Fn5t٩HFW5}p-Y9m!3@ Wt|H5O%w6xy~5ťd ȃ7I3c]kF _k`(btnK(eg|z`eq\ȜVxJaU2:N֞ww>Qa,eJ%f@ɗFNæ}RBҔE"rjav&nR޴!T$ 2Pb'ܸ F $akbC9CHxwS Br^`j?.~%v߿{7 .i'PҜRҼW]㷍 Bq?<0pO@C|*Ç{:{[ _o8;<{E"{oLI n$8Mju*dI@LYLr$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I M6 dCG( ԧI\?MH0Q0;d4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@:p~$8Ors& 0@S@L&hH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 I MiH@$&4 t$@Y0:Dp5 &N\Hy\fi(+O8|x$soЕOL}OB `_o ߲5? eOX'Z =λ'|H3!82Oi9j{ǴtN1}DL'ً{esnr?}{Y8CGp/\uߛ;tT|/vq].??{kF#Ϟ]E1h6N,d"޽ +^VIC?}5A>󧛇^ aAcdA |Ͽ|'dc^{_OTr8җ қ#/jYrQ[ȹG;7#ө ,/`t;BqPTWLv\7 DJTꈸ \DuM+Yp{\JV\Wf4S1/zOJ>q;Ǖ /EeǸGU&p vm+h\Z&9Q*D1N+f7DJ#W\g6}zeEZ'8_;JO^ ֩W,N)iKʻD`4fn{Ƞ&B8 8TΖ \e՗6dwvp@:W?`J&w\WĕrD4)̂+'1ǽJT&uWGĕi𥸂h\k*cJTm]qmpE铫nMuqdtԊDx3MV]WzXǃg35Okfc<42'F)fJ8 DnWPK:bt'}Vqu\g^ôJp;yi\kw';W^3=)Esaړo9 "7Y0-j`Q>$wv+H8O''V6Oksr*Q3=qEE,j}.*ڿ#?LL87 D.Yp%j;De *!&•NJiB-[;DS\W3= `rf\Yp%jiŠ䬸: < YW"w\Z{Ǖ ꮎI|\Ap6J:? 
D-~JT&>+k]̕׮֩u*igO \Y՗6 X-8ip%rcW6 *!qBʁ'• |j+\\Yp%j;DeR\W 5֧Wwb:juGyDi"Xf7 }tӢi|HL'OO֔vwڋp_\W6ztIo䨍O*V[啟Zbbju'hqܺ}$[xɪwJ~6oGv 6/s3֎,Jg&[ՒeylnOՌ<~UERUN;]vZ-/7_5Xjd/UP% ]J3*ULiP ] ]\j+BRxJ3-%-=-ln(8xteU+|ڻWWҍl,`h`B5tEh:]ʡ-!>՛Еgjw ";i{#]NWM/Nht 0C/! .?0]uC fGt%Fzm8S]/ .g"J #]!]qJ k.W>]JG:B\J-*+.F]ZNWRQҕvgRfGq:\S,3R;;C-\ײIz)%xĚue8 Z:5F-/"ɍ5E@5RZE #Zs%Ļ~){$/+lh$VBWV%F:B҂l "BW ^]J'F:B2BNZ:3txurteq&"UCWVXVOf#+'5 "\D-tEh:]J9]qJ)᷿áCvC+z7j`k)t%GzmӃTz`"\BWV|#+.^;ƫ+ "NWRtut%0;4mdNsvT!uVh &i &r4M(ii d?:@3 д3;)4m@ԩND?CFrweDmCiMUk B&Ӫ2Z[LO<"X !` I8bn0&(6/1BxlߖO}5ֶi 7"[-i{duq//k>7P9^f^ %Xk̓oOmm0)ɋ!Z#P*Jc W5K߀_+몡+D+%#]!]-BVDWH&[Z7tB֜դV5"G `c1ҕؚ [Q"jP6 ^2le ڻ5^ZwN(ЂAՁHWmz+T"BWV1=t"CjM',"JBWPZwCG:R&oo ~9ti;1C-ת i+l-4Mh5f ݷb[u@Lp̺2;#z2zleցuQضv}H/\":Yڙ`7T̔xq[o;W$_%ʨ&[,o|RNg`P,]{]]I"?U lBX_* 8O]N=u_l𙢑\B?uޞ§^:1cOmYyv&nKI3??;KDL$JW0x(<#PE)sȂyͶ]O_&図yjn)k,/*{ƫ)z4_ȋ4=^OtgWDݱ;O?s[;{Q/ih+`1β1X<_tSrߟ9|mq(Q>nAeߩ#*嶈!j+gs*&Os-I$9d\)O ReA+sֈ}ʭ{^\NfVs eZ_[&bQEzA2M-ˇ|2S/N>\  s:=97{}Mn?vG!,!ێ=d%{wgRI}O^YĒ`nq1Ow'U qͽ;5q6=b}^ÏR@ROI{rfB /% ً& +;F% Oey}qS:]f \\Gk9 1bbP2y礵`2c V:()I K{܁KH\+}aL@Sx >zrb=QM|Y"O4TReh ny^|cuK7՗Y`dCZLn6 Vl=? +2_.7} #@E+S(+Jo'oU\ѕȠDfWT yYHoS[JܖUX{nwΆbOgC)b,Yj?\1iige6C@iR,Lh;t)ztKc@y\@iPAQ5CC ũ+6IE8FU2F XU}Z W-Ʊ.$l)3}Lə$Nm=0|M/NW[EEd+AN'˴c ( cĒs41^GM%33A*V 诔=3 ) JX+A1ƞ78;j֟h(K _՗i|,M^*{,iYU LݓHlt5ϚoTL6jg ϋܜ{$ /oM{ЇNYdtF)HL ^(QXX&锲J wr5%"TpiIwHorҠPSJH癱?fB3 *nQVפ$ԬOiM,.vi7_M ??L'o?cۂ #I , Tc)9j3Jj@w! 
^[XQ0ę N)8h wuK%:_G 泴b{G[Db!s dƴ d$/3!h=##i.d!^y/8/'q8א:{ȋ#/>e)֠,ƇZHϰ.ÆRGt}D 2#/~ ^ fC3CW{V\˶|v45Aie?Zeh T=[=\]i։r!fiGS'kofNun]W=hąW^6XHUO6 O2hm>m>Ip1NnI7Dy!.]T6rcd0 jTB{ kW@qטvJCBg[N ,V==>=bAoG/eӥ=!Kb\;ŔgiTeRx #-iqdd6h0ƌGbG.:9JpѠ3TQ3T)QR)zz*GxϞb[~/{wT|k3vk,,˯wgr;Zrc+%p *%D@iM"EFu0lT>D Jh !렌BPQ9J^-!1@X{^89γ8A7l-QBėe az^(.Muk.CdTd$YIV掲Pwό[I#W\x/&R UV|91P(3ݻH >1@,Uxf %.<6*B`9UL\γi%OGQR[7#q8SPDdF亊$<ZĵdQ'3p6 Y]9+fĜ/[+;* RĬ\~00Oщ tIUa{łB$` 0AS0)$DkhA iTVgVvN~aa4cŻq  X,܆q^g8UOݏI^oT3 jcP{ e(ո 91PTpmHLJ<.Ձں l197zov"F|0hex!hS# ;OE%1B~ cT<3 ٖ:tԡN$ `j2qJa'm@!䞂Mtl҄ge/: Eq>hK-&V Т4³2 )9KPj(PI'e+X(j9Uh1Hah"'^!-7 :I*Zb|8EaRk(c$rqqs.AE`ji)Z.8fs}r6-)KZtbDL iѴxړd$OJ2 )xi~F9ypP)jc*!Jy(M\:tɨ 3pnK A8>Kd8fM"Q $wT88.e !_-njy2%Nd^ @=F5K )(ᘦhzZu$3%ea4Vp!y5=spݓcNPKx玹vSbϝ p{aҦ`)qrZed+?F˺q~m)9TK.D}1PR+nJx#"?Mǵ^mVi]U@s<;>K,ÿ6*b"A46jR궙n&&-`Ioe2=k֏PS5eQ%BJA8*E ~_>x?~,827+\vK]pܞV!E^_+rsZP}[f h+]k=hZiB~>WytN(EH)M׌EG?S;N:j},WԭRtd HZ҉ ҁxnK~jdbo<.>罥BehuiC-m~O&]4)aZfJu`S("%eWslu.vu*{X_h#Tx9.nJv8`GKeI@.-L%ʬbN0p5@s[!-/v&dߎMz=UA*n ;Z]q{.:DL)e@p+]]AEo3 <|{Si?,ڃ sO&Vre?("m(<.N ЄΘR"Kƕ^ )Zf`]s+dt2wu4FY <)5䔈&BGMP)sr@΢!9[ x#+Э++48 ٯۏԙf6jvRxЎK;^hNbr)H T~0Ǽ3xP|N*<|(xC.uRU~Xͫ7hh7}[l<(|^ԧ3d8"!++'Gbcq1砎?/F1zkd7zivø|ׁt"XU2cx)w$O,XpkYpk[pYp$BLONhG84C0ҁ$YNQF8 :vjq))\H>k6 HJ S^&$oFrނ΅ɂ۲)xS}x}t p;?'f!xRSTpaw).#'*W(? Đ_X Uiwe"Ϋ蔧~ҟNX\&NDJĈw HPøy[.snz޴%SgKJnBVY "¤<4z4:OZD4f ܱ &2":Üo^.޹AĽ|EɭnqBڪhg~E=^I>GԠ)]2QB9aG#rt甇#7_MϏo](N ^`=u!29UD\K/pz 98iRv/.B,Ї?u.Hբڶ'''ѕXRL ir6i11ʏ>خ&6PNCK5*ό 2uQ7Ϯ5rQVyVx4k MtQA+SԯC"ӧO~|4|_<;})e'=B9ܗ)?m]~C38^;47ben|qmS^2Oǥ^u8 1m>r:xIǪe7 7?-IUh$ZgwmqHW ~Loeom˒VVfU%$+[Ui%`yHF8A2t;jZ*ABk-LZr/=pI.{yHz)9ФG. Yc ((1yHjqAZBjˡӬTl+)u֝;dղmg?v4&~4˰ᛑ'SS2eHC3FhDFshy&ۿ/,/b#.j[hrc6ѹDVq1Iu"9I;3R/`8 !]. 
\HRPFw&.Z&}_]WgGyvUT M!WZ"0鮋֪$Wpj\HqIYv@>k}u5 xd2ʊn+Vn8?;PP딆ƕ2gds45DiuT%b%E'hRCt!zo 좡z<#tT{T&g'THOv*OqftJSr}1_ ܇%`X$$#o/QNQ7uwv8;'up<3vOU }_p/([n>~)b 1}@'**ɗ8pMcU T #M*>qs|Zdnp(m/OG7F rr+XZl{w2mOS*$*FCUܴ)xUdt37}OiKATªo@P-I5#\"7kFK1OEN~'R]])]ܽcGSwoFx{6t2R䗡9Gva֮ӭNo,Day˒)w 켚QfSKW(w ?/ V9Ulx6tZ ]u^:zt0]{O$T?ӓaߐ\^pN.u᭮nʿZǓvvIvӦL"@:$$CVJCn[x>wl*c4Ҏ:o<;HVhsi/V5՛Ww~. |NGTܯD'+sZ.?Z[ѫZ&2>=k~[__O_UܫoFߩx')vPV?緫)~!h\T4+z;v<ԋ.wmm*_ۺ'=g4|pskr(Gu":Z.:qbkkCo_+Eq3]=3ǡ zʃ+vmAWv=)qgDWfCWЊ;t(BW/X93OW총 ]uBWCBW/tUjFtF'pBWUGBW/l ɭ~g{HؿnUlMyyvgzÙRo^wfu~z718h R-[!+{}U8:[C<)IuS75 dos'u$zBhMiYLޕ>^|dgnV\'-|<)[F"MY0q7?l޲3&~ϯLMA C2) IUQe3që|вiGH&Xph?tQ({$.]u~>Yzf:JDeFt͆:NW@-t1 3+f%f vl:JDlH)|6.٨s|(wz9tS; 'gC\KꃧҨrzZi&흮X=|}n8zq(w6']GЕ[Cׄ]ߎz\І=|(,t銵s銵S{_|\?u C) +*?dU\誣5 vN-tJ,j?E5){7v#yߎ/.C'VWj{oHyG$I=^Ɋ M Z4H vd#>gdEed?Ȏ9 唆ƕѪ M ,UJ. ef..]1B:P׍g?_U"Khy<>\`srϊ߯vufQbף29|tOon{5փǪp鱊QkgQna4o~op7{je>\jӚUx9RW_Quz.9jG\t܌O?S5qu(L)5 F{ 8} 2:ּk94Yٚ0|m\l+-!ix^>՟ $$WD4!4PJ`#df` "K۝B:o3V7_&M23u&c_m^b]B1k44_#(*KN.NDH L~ a yC>ώ2z ^mzx3u&hMOkT^i:Yu rAh2@n]<"Y78$_2)X va =%EHiDH2ӊ r!}pZ|b#]+c4-G0ArѡGHVOQkusx , S`չI,TGgUg5VwFUaVd%gA^VHlr0Ovn}#󈓹hO|;X>l#ByFF$S0m /&s 7DR. ˅@"8[451I'Q3mXW!dľ\;x@6Ч5EjEʶwߪᩃ,-rU1p_I>A`eP|wpo xDh+YCN0",0p# ^8Xe3ֺgkƚv7? r!e9 4YTxf)ͫTYU-z/z"[+X:uw IAPH>JNr\%T \ƥ Zi̝wWrNaS>QNqY\I%h00Ե&`骅 6`&ShݴKQ2 5$gy.j KV l4h7cDl{ohVfAx0/a,T09 9C ZikptR*B0e B #8 :P#}P`)WAX ^_:IBYa]r\?0)NÒ& Fyne` /TxB!E | BJG3⓪W=k@ׁڥ-q ?ѴaQ=+NF1VF372RH >VXpkK>9. ˃`LT*^# Քq@],޲FrX݃r5W-D x֠ TfrG e-]XN'X0e%3:652<bi7,4プ N}O&3ҁO K C@gI%@ 0 /a<@AႧvavNNl!& *Vn:"RL:FfAJ&BGAȰ]u%^r 0[@$XA.F銄Xg7L00vꅁЯuōl Ɔ)$k\[?16^{5$SAEv Ft*h2ЛW`?a.pv~|:_|Ǡz2f0$; 'gMMtu鰞oҙg`6t%/GPu^fa4:uzm u!P%sPo>PPj}A` L\nR+< R*I$S$48يH ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ". 
޷6D d{s=&J+Γ@(')@`2M$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@OF k?$kސ@ 5;O I gD D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D tI όrD ~I R3" CH ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H " t5^[ImŻ޳jnfp"PTJ{ir6G̙p f_%V]PJEW= pRf'u{&F]PWW(ҤRODQW~_J+쮫+RZRWOQ]ImZ]&R?x'tl~gO&yes|6?ռOJ97%(k $X`9 u=^~}>&8~׿?jC-'ϟEgO 9zƸ+%)xUWU$AP1BJ%/&Fݫ4N7drjA}UonovR3Pwe}?aC&|LTۋsDz.Ԛ;alp;N~bi bSYSB20XܵU9AL7DL؝g8/+E+/+Mσ G-G&ZPKmz6`c~NӾdtv\~Ų71~L{. ]d:k6Bs,C[4x0st6^6TBJZ2oNM?#6Xaھ0B]+z:e M5mڥ6j4GDقisQ&(V$%D1̷2̷673XMPTAXC*Vu(^%KLfX xS C3qn7.k]2LUFf-l)76:_5>t"H#>//Ӊ='}dp i\,D|}ѳ$x}:efF/L'>]-ǒz @̞ǒe}U- P5ATdͥ91C^eZ`1`EҌ;C SԀ"P\gn]Ԧ;cw(05@O'gi6&OoWzy¹[.1zKosuzjNfsW9}ŊyQ߻0[c qp_:=NnO7r.?@? QP#q Șy1`J"E4eOljFZ2ra>I#`pdֺpt3k7gbHПo?ʬN ^1-QJyXK m$ec1_8C_~(N>5^o6[Z?ȅx/ Gy!z*}h8L2pYWdK&N{䈹La:-qt~0 <>(#,[ +AIL# cB,d&f]^\DWsha:\*<,ɼ[/KhOTC9VO|͓?;>pS۟.A˷_Hnܪb`8壜ͨy}׻Pӽ㸵l~sŋ⃗5hg%q<Ѽe_Ho<z6ky oմ2~VV[KR#cb6'~VU|2/z.sxvB 140MW؋ IT5ӓQY|^TQDXnժIΉ*hJir>TCW̴i8;ꝍm1nE˼_! ߢ"w%htR_,}|K{ jro=*)l[Kާ29\?ϥ=|<.sɧLcgWFyOa8`6nKb低Yo0Ө̵Հ0uKeB>R2Ǥr:y'خ߫ž9d CU .!p XVRhw.F6zUjJIz, C^[ruCiiۿu";jOrGochl1BaXύΫdbJz{s +\*50Oǝ5sxqlTyԜ¸4V +]0$LϋjLмA/L5UF%R,%H; QmFst93EjmsfhP^cvQT8p/tPC3qrCj`|C޲||aJ˫*_fKjlqWA^LJ,,Jw˪9I̭Ҥ-eH09XV I 9\5(mnFOWM6YǕd?\@akM:r߯1R,4I#yآUBҺ:'yW/E:tJR7Z-g7%{ELimv_m7ej6+.ײ][o9g]\[Du Ix)bvzKGG = &j dxRȕ'a% [ʮ <2ɬ5_B& %f m>kE6 lS-q3*Zf1.9\KDq:*E͛μꌊ7Un0D}EI=D8h$c#c=؎<ewRC I2)LXAì$;u5 l1 K4hx7]!Ww]\Yrm5Y& T*$3Βpf l!jB [dLeܙȹʑ3atүW<^# lCvKoy ]=&.| JC Pat.zp%պr1ɲ$$18E` s.H!cUY*αo,I\hBZ4aQwx:kL6? J 3YchjT䴠  oZ ʯԠ3(] J׏mZe(ML SMS]Lpm. 
BwadLpIɘe@)*APЪ,72˱LW*R]~jl@4 2>54ekwe=nI|bYK30vvecϋDR}d˖Y$W3ul!-v1+"Ȍb_fݧU"Au$%gQ720DliCth8Ǐ.u{w):.-v>\|] a-7rhx D{|h+Vs%#u9}y<;tv;XvŌiFF0(Қ5\ݻZIɸNr=<}5(FB|}wP)M a˗7~^uŖ!FOݚ+`FVx2b!OQVz=ҍ6_Ocxi'/-OJ8r糗 Rn)atKK\1BQ{<2ʐq!EcS,,NzQ޽yT}[o1Lݪw/l>ژD_>GT^M?Rd[O{qD.7Ҽ\7Qnw!#-SeѶm:~ 欭U>w-4կ3F]u~w^F\SW UQ\xU5zpyhQQ[aeBjVq_> : 5lV% ;$~ui8Bm>:Z[RU>* ;rdl(ZufEN7[<Ȍ+[㥀?.-=uDvGx|qWŦ~3ڱcâӱ9 4q:VDk*OǦ]Xs;̌C¹yV9Xyڑ٦tKiEڭ_CZ6BƝsNhKΪ@@J跱-' q{372ِHRTjzjgS(E2Lqelz" BFuR ލ^Yex-H("!% f(Ĩ\lC$8(}`[U}wp4^ב̩7JcIC` %t#qjJXl $5Ɣͮ2Nɕ3I$y8LNI7S&KIyO9 0gxd("wxxR:ZM})YKx* C>";$M $D2I"cL$4 B\F26t1U 3I$2LʁLnH'Äboxc@%9ψD{`U`9?3D l]Evݗ- J%_ J?ē'IO&5qYEs ^J'K"&2؈T ~EG뾨\}r %-k>93BԺa+D$Qfuz=(Ô7<A>XL/ X+4HH034ljŭ #j DPW'$&hqH˫>`Z9͹DHP9fb,rF֐?Ćo`Tc2L XطV<_Wc(?Z%y1#pN:m@`/yŃNVu$A"*k3NagiFČ8G6yA$ #FEktS է/{&DE`X)0h>(D9`-dQS0Ӻ3b! V}4tR,<>Lwd>5*"C 4J*a/|~1M"?=yna1)rV45~o}x;NFɿC \Jҷ77c0>I6SI% 7f6ӋͩTFs%84]%SkA7aNC~KlwRDa]C/RlA{N ¨-t*>(t[<* zS)gdN7ejj4F,>O0p!>AcA/.u׆ۭ#pxw^tb0<-k,q( He\AK`M܄%x,1Re.e|ӗiպ/'~ \\;E"Q}\%d(gES߽)[Q{Qc[;z џ,)mXC.Az4| $Ϡub#F9:hJ<$B;m`%)'*ŋ17MUN#*vhTqaZXco/rl//ᠰHK[iJ^–&`eѦ1--,L=OSb۞g.37VRR)ɵALi0yNCR(Ku!ɛ[fCH¯366m@zo޽" ՜vP9#*)PԔ^\] 2Xx0QFils N T ҃Hbh#{@󒒽@u'f:K/A1roW >Jj `'9xXBcxðFx+.75@[:)OGA չ\̏R9ټ\K\.hHŨHd[횶@)@.2daqIWt]H.Ʒf >ph-UXjLJ^jIåVqQ6.'Y jk\0PTnUGeƘ2r^i|Zs|4x]QM->x{AK GR ]Ͽ}Kj@JYdBm;ځʺ8Q =-.=âZs9(#*F e tD!DvVgXIY߅MO{ /]^u~k^_#r ©,My+dԼNܢ^MR`<~פۥ[rZ6ϊ4| ͻ I;9yaV| w6?'ItDԳOhxShwςQxh}LZ"tEnمħ:R g*y Iȟ\D)M(';Q}Fp߽?6SL͞NPq!i~}Lç5/ir ަ5 >Dx3trŏϷ~rw;Xsk!+Z~zwoR'QUózdO:*tї(ڹ[& -1٫Yxnw,RL3]2{AlnJ_Fϰ%Zn 3$Ƙd0Ia&ä<7m 68XJA]8N I'&`Y)3*"oZ7}tLgc.6]6M*$\Dge)X$#8ZDDF(:Sq"Q *T jrگe!|0.@Z&_1WtԜ[`gPbLROq|R]kam*'Gb 1: o!_ٕ `ifvm7-3˜1IIdE^F>Y=7 -,n W2 \PJ{?xacIe *izeb麟/lU:Mw>c N`\~\{l0_r0)rd4#vaR1&vTjw*zCNZDcJk_ E%&zg</=x t M5STT!Z]-NӖ81|p)+y8Y {N:ĚK~ioq"ṚV\ BS*s_+0Vl%W9I9I9I9)fy_i{ɩƘn"R;M)iœD:!) 
6҈edڗ9-W Mi{[gqSPPgJ A!=$@HKz56W1P@uDDuW:x]:wtFCs0 4Eg6K`*uT[6)U >:tgةV"&\Iڽc,l`q ^`qzn)ab6Lql(̉p0p*Ah' g2zMvljG- H 20`ԂQu4EK_-6ޒ[TNrjd=+ن2fyp B9Kϥ ס}&ǖS .iʄѝ%X .ǎ1bx-xp&xLc\==@U8Fh\H&dL80%PR#Nd-Q`&MRK р]be K=nIBgT^= lF_4D-I7HYEXMGf\iju&E5oNO65v]\<=@Įot凟:a#Kԡ?%`РKG~U@ȮnNR}a#*lOuIV]=:!""6ZIGb:oѾiRHI 9Jh[.QQGtjD Mݘ5BWT悶dFh4[W)y|DfͲdvQNU68<|)4WXd4j58a>^C|?|Ip֌=ҿ>kXPk7(Q}п1ܼ~J^s6qQ.Sjk^0.8R(Xz CAEz6CO ZD~m&rUnsޞ6BV&F}οTbцwv ò9qnءTj~(Sd @SޛJt5) ѫOF;.8zKx~It@sQËzZ6»|sn">z/!$&F5'<2o$Zg|tdSSʽr5V@%`PD1^:¨p@*4r)Ex{Z$?˝rܔHfPq!cϋ k8F m15oG>ERɠH! CUJmhڢ(8u. j6,G nѺ^~5% :L9 ݖ=V6h0Ft,0+./pu: WA,+wN(< ?a%b bխ5 2E$Kt\ߧ)Bzs9w t^ nR WU~ކ8I -N4ZuD?gww ȌP9ݫvI`(ڋvwp (p1XLtX(8QEݲ 3YlbY7i(+VK18 |8'Z;h5 gVƳ9~*G-i TF\TH|mH|[ZFYsSk`bQtK 6U0f }2NhN \o k@;~f{cσ$4yY^P'f|DH- j%J~ v&wrܜ1,[P6Sw׫﮾z4ty{vq=k0_l-< [m퍜[t\\SFv9 +uh\}w.m& <ꀌdpLCH t8q| 5&H/KgD % meSV xN8/c8 ~tNޥ}:%*aR48BTgDF 5~GGjPQBKSDQKA3eIjDE*R!P$`#2K5aE!Ĕts߅oPk53pCF50ITe_7ˆuK:KFoJ`j;L'qY4`2J\]NPv?ZL\|eOZ$5W5vk ܾ hRshxi|l?D%bP b=Es%BHwaYAPtc8ScAj.pFiaڍԢruQHM5K$5 `˙wā@ Npˆh*ԙT)AA{:%2Yd/CiPa'7忓[<8H7 {xq:0Y>ݫgN/isQ;8lA#Oj%"4d"B>ՍB'Htg;-`ϝB(4Պ,eMe"2oHL掶!'%~!wlyJHl-FG߸TؔRTT0`JE+={hcI8p>ViX4(p/gJ6%$ ܕYqRT2Bk^ap2x)9a32=sy{e.8hJ ᾊ$TM[H?TgHӹ]xZ-!:w@P!Cbhu&F`Pșbtyk-Z㲎 wHn*@-| {6k;jЌC+9sW+{Q^y=rB2"+#LV39?-5q$/cHȯ3YYV1Q[5So슉ӽsq=-ULf2.H*)Oq` E#1bD{q4"Av:eS q`RaG{::YaW^MTHUҫ)qvSQ߁gw67u iu'ê>=-jW̹+|=qA2s{'ҞO%y7I.>S\l֞A]pnx^ˆzp-6?7|wiϟ OV%; ֤7.Uw$l3NB5 "2 "FBmXi8u֙9kmP#ӏ)8-R^AkPnyJk A|-(ȉ(wpO=t( &DBGgI kO~IaBkGuATD¹BPHk(9"i  o t@q@#8h Dh@j. J" V*!m&ph+*Pa:(u{ubkFrnҔ<($JSj0[xD jG Br`M)uOSTD >!-<- LjmVc&Ud)@y 35u9i zJB7^iz' nfG#T\?x79iZ:<<%);B T:l cI{a\+6IT1\z^pD//ṷR$ Ʒ ݭ1ejDN+r0s7'lfB!R qhI"JZoT2 Fٮ%SyHh%i"].=)-\*(~ `KĂ3G}fg^y *OYaC ZsT-#| 6 /2+n+4*X{j`R7Kmk4`Fv)b^H:7dFC cx01Ғ~ED#M@k *ki^YPveM#~6FLIÁK̈l3~3#Z5#yGrDom|zʤf0V0A͆K>|Ț'g=ݹ]S@die,[@ec!'KjZ քҺNW8HG=o(ӋyO(aI%&Kv$gVIiM2w$ְa}AJ8rLcJS} VY>W,^mjOIĀj%;5hTZƒQXs8 ٫jEsҮ>iZb '<+-]vLfLiP@ǓUpkQUz]4faT;zܧp2OiddHREgz@Q-DpiPM-O)S/c?7>cR܇+%z7_Nc!f?NU. 
ݗύ:;Nqo4֐y `Nmm -8=L.@,7s5)9,b(C 㒸v}⒌Xv*)F})ۈ>?޵qdBe'ŀf#Ol6dPWR;S͋dSfwS$cY}ss(0e@O#Ft03 #9/y|<nya|m|,PfNѲe;^CJNPډ̇=Qca.9S7c5NZ8vKγԃXg|DhdzAe0kJFF/3nQ" w;^GS17Xoh^rĒH&tb-%҇K?.LU ADfIBV)OsTG]¾Bƀa0;#4O}DO&)k ^ i3%]+v]`Q]f$VݼݣzkѥKj S}_`Op22P#;q$Z/qA( ,0rSehҜ؛mhKGkYҖOkfz;(o^>[ݡЛ\MV&a}&ZC{`'7aG+si:s, Zdr~ZP +EbS'iAV;i'&s R^/ |cў`=dMMR lJhՌ&_p:*{]0| ^!aK Ȏ/ץHJ").Jj1BQMc H0I@k$ )E$)qRXVޯi6?F^M/zK7[b{xH`靽e(hv/ߛS0T_EXD[7'l[ ՊӲ"ۻ|^[Üg8k4&e ;TX"0#fZH0(8uC"( PDF_9z*pTJ- U^Xi;yl>F}Ɓ62㘱.D9jDB5xS{c LSB5hŒebƆ*ΏQ6JH=XR$3΂ BԌe^aF&ɌTuաT1^:]wU*-`qvDsjЩg @֮MI?IUC 2ZⴳX~` Io I@zk ڪ15šKf"Đ%G u/w`Ƴ{E c%0IMIpG֘ϒ'_C2K%&HhLr JMAOu,LARkՕq@`~pu=L㗏>ޯ~71o oQ 6֦Gb%(EnzwY0ֿn^z"^͵o=?2֟(RfI=jnb;{?5YT/޼k ɱ9/Xp=44 Dx\ | Lx8GѳHEpYd*HyAp{7XLKd-4KKW 5[SPjӜūpQA(NGk{q筺+0՟KWMGk /8wHT-1I9 b$(ΐpjW.~v0Ԣ s}ǩv*iù=ofb}f+rӪ)6wT8M':VG(j,p!3| ^ Uϫ g#P}`4%WovբB Ik:N}rk gvy,+&{"T'w{@mIXp% %if A1DReo,ָ9Phzŭ0nXDXkb E9eXOxYf_69P7h5l~ק|fIʜ?SEu4"T]H ~"-Ey]#fgQN3 p̢V#U&2'1uE|o&AWN|U!N,J{^\X 22O}n<;/|ӹn]2U#L&p8m\={ 6nAA!p+^5'#]^ `DL'U* š!Zs|0aEKd =F1& (0Su1+% J6>ʤ:uhc;NC=3ICpyٗi#zfݾ<AzױY̦3D\N$57 -0*m9 1GR}㤆e(pZֈP6{H@78Z"_]N!HE1畈PeJhaK]Ԏ;öhfEC1C$AtS盖" EMieRq`e\y9MѮ OxFwf盚Qfh>pVd!du4qK5dF* %:if%t/&FC:NW[- `]-3LcZqut,jFSQYtSeM6at;~^TiZ i|x1KhI8HD48˽0)2Ket! Ԗc)}Ơe{2př sPH~G>N 7Ox(FD+}y0[Éy!46r+ 1L*xNF8Tc%cj A fY<1.仚 !J^jcFp364N PMD$x.+{2Q Fux T0\'KF${KhƔ m=N!c_ԙP{!1ו_&WGPiWReȥыd Lϯ?MSz?Wg=>yz~L6Eưn|a4hߏ<->RD\Ȋ g$,3KN<wmKΥ,΁/”Ǯy10mε`/ƶ+rj¨~W՘?kdػFrW}9=TdCr$OM7Y%up{زdeK-6Oc3[ǏYU);kpk+N.VRW6%+c:[og%]Kw~;Dpn:ҺoFAS:ܶBۦKA- c%8oH_st%_Z}1w..b83^{H%$>ɷ{ lȉs3edzO ao޹Mc.sa6Vob>)~g?׺qgj"`~P̿L_U|Y:W\{ur~v Pʆ-Дu D@ϩ(Dx::CXZT1fUJ4rEEc-/qWuP]?{{_fߚ$WC9eTv77_rOPӨ{[v[źgW0jbzm+Lng+[RV4۩\/bORN!8M[`5,i𰼹4R$>DF{!oNmv#KEY*=@bV8C7+ٸx2|#ߓ15ؗiݡUO9մP'1$^sfʊY>jò:K*J!K*DJ(Yz+duQ:K +3hg*wDFO)f[+E,yDEMҔچ{{y_zC {@4bIK$h1Cxu9C廳4ʐè1FCef5e[NY$6`G cEhL;OqβMgo\mzմ,n#]8o+Tks L޼Wӫ%Ae]yE{ a䆞ѹoI  ѐc,~#=Tq3q{ŗ_>q(q:6PD E][C6itTap>wv%X̍)K t7dNLҶ۫`|a!{N! 
*GgaNeR79,{յ1+upC&ә"iON %tZt~cC1y.Sh!85r1s1)rԽ}#bj ; s;=gk<<MWPÇ.byp\AkϧxdlL{fZlO ˘ dm(EU*-BE珏<E`3s IFYnDۓs{q]Eu@k࡞fw30ڼL/,±.'$GPRh&\=clH-f[ΩJe:c i pi%H%4' ٧'=N+Ud'uE²"qAy 5ud LQlzx+%cH,Ĺ };zۺi;uBjkkwռh@:&b/-h8؎%iv<[yu&h7\Q^ݬpWcw(Ф "Gnz lbF\kv lFL|N5aLEI4:IPCHG3p:|֛2 ~r-q"ЅӖ ~rsgxmnÒk@* ɺ!![2%c|-y|OXv): Ԃ%\"1RsN)0g GY ̡FN(%\~8]̧/J^ŸcA+sI,P`kI^>o_N.㵐3/1~VzEHW 5U?^1_ʒ|Ax&=cND3am'5kXݞF"{@3ǥL&PIVɻF4-*u q:3V}։/;lmdcMZ]}9Z)鬔F\S%ޞcx1ixQʊ)2؏}֚|`)UͧUUU \?[1*"Q\6% ڛ˛Uq9ཐu9b+4YVc#80 Z$g^A)4W1ܐ6D&%L`T*e|II>\5 ndVWybʂ#GRc'{! ~̠UO5u2ޟ {Xr݋[UuxO<(;0~xm\Q dZ.3WӋ]X&uSmڌv\m8P.?ꈹrVH^Qnp9ÜjYp&gM;*0Jb \6 ˿z>27_zs8#榡`޷Xa{4W 7GZk*01B;F]QT5|i-Lp YRbV;!cE`S觬WDp=LˉFRn͐JR6|au+QC2dp[DAc+}K)1VM 5%2VHA]Xw۬# VZ/,H9Mb[em\A<x )@BIjO6HV-bq# Z^ki2g}ۣO&yx H-4=]P gRvZ`A&9wbsh7f+:0\1Y f! 6.RK[k%1E S9w<<@nC#2O*O3-9QLDMOTdM R]%nǛo4n,G7-;0?@9mӆ.UYG*C*\rJ ܈*bR%jlSodٮDlnn͜S*:Cj>(Dp<-}0(lH؎ƀQ:~Yk:pl mPm>{T1pKeZq w(\,b!3K88P8eλ~p$W (2V}i)D8Lr^[g|C=A.r/k@"aU=-oQhiT.q좒I7u: M&.p",-:F@0UY(>:c#bzP2~Pih [ό hI%.)Vocl42v}~N+ ʪJ ߡ(GycCq/wD <1G|oQx[O2H! 
&Q?>}/clE1VJnI鼡'&z c ((y# *?ga(}#MRJGB鴡.|qHav3?N%>\́Fk<5^3L&&G;D,vvͩhT1x@,r ԰F(KQ!> Y˫603ZIcӓ3rW45_?]cD+1396kc6vEpC0xPͺ8\4Ymɫ .108zJY<%(OJZ++8][a5QnUs6vF0!$}/|a*Wn9I $C!1Z]š=&ipG_Ͷ3a.?etz-jI<.dZtP6t^X{'nmlʆ lдl9JuDrA76Y\`T];ٻ޶eU[ɾuc5ľ }:%GF$[+hMe"w3HCgcrKg0)Qgj|YDJgLXiXf ٥nZf*E)t?.)NdwxR|[3f֝+奆RR*nI_ %{{}%ya烓%Kf8 y,4I1X 4fü<ҜOS3ĕz"+ë==y9-Wxn)8;.%"F1tߋXc3Ba3qΤb?b"%f"v:>0vPg-Kwۘ*^0r.犮ǭEEJyƯYp-'׸2g)*r]@8 yzvB]Uh)c<^*Q0X] :IDܝ.Dtъr9ht}i⫃RYURS8]{SޱdI(ˮ;n;=1_ :v?qdwük 8ng ܼjuT' 1hNUBjHDN"@"ֲJikY%CGdiZ xFm):9 QXHK8mQ3M}$SeWz0sZ:O<2S( p\a(Kt%d Yfhm;tv>qSS!Ǟۧ$'fBhO2KV@W>(@#^1N"VjBca+8"6Zω^@p)6j18AM`RX[*er%*puUS`IPVf̠K)`(O$3'\2]$D%X6B̠w(!M*DtՍzJRR|uF siZ1j5vÐ :2 !wGEЇw`e( ˌC[ @Η,y?BO$a3ԇs֤gߚ Ͽl'gV{}sw}sm/;-&_~[ ͼ{vo:ͷ={ޘovv]߷؛Mon |yݿo0QQkm7|>̉aw|Vw4>sGǭmC[-&{hz{ߚp0<ͯ!Ƥ..oi2=0^o"Cqq#==L1#8K?My0]0wx6<:I|r`vwC3|njZN48zu98<9Jw4m~״|cMm=A I8<-8ه'Ijz9?ɰ5p^Y[^LDlҍW 5r}0{$[=WɋwR^qNݝӧw]=s޳Q >Ls'dgǠr{(e8<\0\umh\Xlf;Z=W}k.Ս^ȏxpW߲d9vN[D>BR$q,|^:qAx?:[0|hn~; ȏOt>_v?A#/3=K'~Fq^o@{Lo?M ooQ~9Ԕ=1+S`7I9Υ}*,]q9S(T01u -N$G}pta+/8Zdv)2F ✩E"ӛVHu"XDR3*~fODP<jw_;[[Mr{>2tTޙ4ZkmV&KkT&Kk5k͵`>ĽI~5eÌ Rڷ2՚W/2WČp Yl;H sD-h~b@܂ a86Ơt`:x,| d|)R$E]Ֆ/pA/a-g3EhDN)*MFOiZsdXbl āx/s%7kOlǘf_11n[3?%XpYuV_Z8CEq#XZ#p܄X$1a)XF ˆ?<2YèoƷQseHQpX.#KKfA0/S )l ɍAS-5T;b=<Ri(̒2S:*t5EΊeX^eX^6_,%*W^oU/0\+&5F|qf"ua.`ȁ՞7Zy%BBPπSk.\L]|-EΪe<hpSul:XU0!JB -_,"X`0F10h0%&a56 ޝAQ HX1q~U [S*,=ZНBxc(Q8L xɘh!; U Bt`>Uޔ2MLJ\(3j+TZw?RcEʲeIKk-SCYE ɑHᵼHt==US]E'R"sS ሯNǀgb {^vjwA ڡtc 51=4nt#Y*1MȾְ1QN=КY ;M ݚ(Zbvs6.NLnnCZE1m!1}5³ӻȰ+Bl~&،\V2q2jzW(C-a׏sfhx(Q%jp[BD T#?x/vU}'9}1[`}5L{7SrǛg~h6ndAA9+M~*,%ڝׄwލ[7kВB7^<kԍH{n}.-hv} G%lYVG#}/4AGZ\+Pٟb[ ؓŀ!:-b0n{ > \[U_ kl;vxg00o;5lx';GxdZ|k͵蚊P:),#*צSDݧY.dE|cahI\x5bs)ӻPm^'_c>h\h޳{b;6O )d|bՉBm!G>@k<)>ziBC0cʛ(| AB#Pju)ɧ9НzΟ9a?nןֺ=%3)5=哷mxQlUh݋y6h7pi%^9ṽhos-&{V]j ?rۛĖ63jJ a+-hڝRj s?S{}p@7Y·j}I9oRnkMP} R %[T˻|DG0y7A"8D~#a:pRsQbӉ ]rvGa-QnGa'*k:Dԋv` KTۋbtDjys PjFάawwh7ܰ;':<j;nc\&y2m_hK֙p'(hG/i-MЖIf^u'هLt2j]=봰N.?}+_NR?NgۣNso] ,v/>=J^w|oW.{kV+ ) @ لQ ^cΒ߉Iaۻ5ІV7z6㙧Bo:|pZVBih3ߵ&/}&OKi}\Gy;Зvs"Zś/9. `u/(lpGl_! 
m#q١ *쏴ɕuIRUYf -ƫb C UbY!Y|aheհ72lė#٦I&mI~ojM6iۤmۓsdSa,./Cv( fE(&*I5B }v83=^# Gb+3#eݹVaG;Oϓ/wHRIx4Ѹ' iW>`8JVvH~LBb3~B̪T}I4`p&0م] :6բS&&T _27uZ.)7I_ikkRz% שtR"PEqĨҽ\6o}`:T3{} Z3=?'2[ݧn_.k6j5J>&!RɺU0h62/oGLӶMiy+5m۴mӶM6m=m,RaN\|T 5(( ,gg}Aa;7Za=}^rR;vֿR>3=AD1~l_Nn/s(Uf": S7 }n#BqZodAһqXiڃM6QDmMn91d*n&77z2U(E @c-RAHuh8~ gھ{} 9?_}@D '< {$p/,s u\wƠuN hVlT7J˵c &lT+om¶ &lݞ%'K!ɺHj$Ex "2G&M E|v(ir~::3X [X(GRQ, <GFu:\e)A2M4sH'ůizٞyN&SƘDARSJI{-0 0K =z]Kq?d3=a=}^`+Q\iX?c`Dys'DV{tlb%u:m Rwk.FoĨU bLɂe*wѬr-WqP+h+-Wr-Wr[U,0 Tay";Xn%,@\eW.)%W Z] Agv&g!C.i_t&jJٛcjTg r:jpΑ Ȃ 4q j*T>)7TPSAM5=4OP1ᄾn'XB3 }T1VHtXB@yh Pׇ4ZVL,]bhdU5GLjud$D_,6M4sH'~|O?M4U,c蠺mkN5.?Z;8a2PJ"VʪPNQU{H=3 yHenCk~ ЎM[xJ7WFkk ?o>/ӎݠdPyyu?zw|t'"{gˋژ-x]{(δ,Ph.08L*;."1typaa?n!c1 Rw:G;\1g:}troUy#i*gf"ϛ3G3i3J?ˠO> nUU ar𔵤b3,}HjDR넑_I7orG7-+oH8_'l)?\Oyt\tm'? yw#OUGıxzq~|ov?9NŪ.,rXV$"Lǿ;&{LF81;)\M&*IdR tDKUȬ y$$Ǽ-QYԺ#Nt|4>s܉LVokdfs×ξ2$ޥ 07grff"@D%yΤwuuF~ Z!Y"ˀbn޲sGv,y&3~-X~ʂaKuU:EcmT+.LJQ $xH"NR$S4ڵz ~\=E戓(KPjF"ϻlɏ`ȶLS$yPɱ7q-FҀ5A텢6 #YNDX26cx8X[WԠB@攡@UzUl⪀ ԔANݮ1;!_LN}ZMoor>i 3V|GxLzU4\`_7r׃Q K_^|64pk:ݫnHe&9k[o/  :=wnų"{p^$vw-UjWsd?݅|NԐjr OҪOiQK1Z38hl%zs['*Umi%`͒\vrm[9QSMv5idQ~]]U= EԱU'&|;lzpRO^x^8VeEF2š1v.c,*Fw*dL q ~?'?d88|2ۧ >J NqɎ{1tF,ciF|9&*0mS A)=ElbA;%0별%N1bb 4xwnP{S$$ @l9I!9 1 #0I'F_Jvv6RBe"Gb'{Ugi\P Et(#$h0ty\2\s<{di A*5k2KU +JZ޹AZKqI qIo;q9QQ*՘Tl |zJzP[>58f,z_;xۦ:eJk%Jl1 R*Z72ڽjI>GԏzM*&[Ln] 'Dk VFS$NI6dS)9>{@䇋Drs[d.!XKƉ9)5SOZh_/{@Χ|/ob5\Hڔ .|cnv9qGϯoG ;7NVhvISOͩƒ04/d|c.O:9B\muPj-uAl at :`W` ;qXF2+aIa4m%lU  aɽ1fKuXnIs/,*]vto!3##A:J;kP 3]LX|a!*S3w?|=2 L?}spQ-6.M+"eM%l%4&1 ^-N$B{ߠZ  O;/kbm0͹pm;g0,>Ȱۅs]ϰVZ׆ףg؜2Gݳh54u> m7 $T[ܐ]Ʋw~l-1^CxnxnSMqkAxN% k;AmO-n,fV7N]Cq̠Κi g9&F?(&OZɣ.Z:rXzP]bmAh:]bg0*HL|Y^rĢ'9<H9I mt6v, ؘݪțR Nً 5{Ybep|SoZ#:~bTѵ)1TB9",ZJ:Q%)A k| ͛l4dbI WKdgd{`#pfwXu~w"hǑ5X肛ԄS&N6\o)Ez0ܤ) !*.k"(Jvl5 i@%G Z&A϶13g}9e.OsxYW[{gl'5T:úf)q|QpYHk  kYs4a|F!$6X޹A8ڃ/z\a;~*nK YPrgVw{쌝%$ Y ˁYPfNwMJ[&hyðqR8hvre=r+PI8BNiV ̑CO|uA[JPnSH`9,#MӽkP]0_ c#[J&'AfA)"zP>vn`;@ jaKGnFKHPo8ΖZ- . 
Hpd U k[2Y|cy2ƽLGLmqfFG N  h-s:}2Z]= mҭ*=ih:8񋛠4??/*RLX|NbdYBt":jYn\ky`p[6ƍ@EJo:fRkm`ZTAq'2/j`植jhJ"j/B#hk*&8l#^<-c\nKZOohqԎJL' !^> \'-}#FK+=&hqjAF[^~xNyh]_;;1Tp9kqx=Gpi~2mÖ=2妩!ң(ѳ,{Xa[-\Oz=֤;9̲re6vvV=Pz{=ݼ]pu}Ʊ뿽xyUoқwW|]oHG8S~[zBrg:97^<8P˾vkNc?E}'nMg;cz8_!`_! $كa%*ѦDd'M]8p. aɢkR-qqd'-O~n8L0s5k\$.dTo<_n))k)]~!޿/0yNa/pegwj% fSOdO2|M|4bi2{'^]A{q]L'\yI}Sd-: K, 9Y,b̂>41zv/?:3 O'w1~+[J{quN+kMMY+-*.b;ӣJ_Ά~8фbQjGRR k, M'yfԥK+sD{d\yq!֠6jQLedl ӤZx9r})#x ԣc_k0&eӡ%N}ՃE3_tmn>_&;<'^@zߞ5AxL |6o~Ec{܂*# g4A 3A. , YcbvO/$QCڏQz6{1;CMD.o||׌Jq"If ޏ6*Ɖ c cl#jMxJ Om& M/;FI"B"^>"(v(c0^ˆB!5o{pg0g;q!<1d0=)lw:\1޽Ya'/AlKZ, :椱mBP$jlbSt͕@t8#%UNٍ7PljAs>.9@Rz=R./ \XdD\9;%K?b5,U4~9{6n6w>4,|:q 7O'hAhJ-2\KƎ%(e:zC%6|dt&ۘ3(;"d|`2-L/o]nЯ`dK~,~̂XƯ0z;{iR22Jh1uُia*8xsScǝՀ i@2bAЫ s9 ˘e~/t!'qKT` NΤG 1Fb2޻ܒx-{>3~5)n˩9LJN\qF+)Hit^ӧAIF6RH A-g3r?R22H\9}2A}XRPi]$}Yo_L1Vۉ@ .cL[[dEȖDak#祇E" qrMN:'$H&NtL9ԁ}b&&<׻)xa cn#t|.'r 4s._ pbi Z)a V8FAk0PJ+;8V?RZS 4^2!Vt?rIֳ ǀz4h(7m[kV/oܖAօ\vwtVJd@M֥WQ%:)U2n:V7A+!)A}?ތ** Uzϳ0 J sW[OҠTXVDHLZ0i !^O9evi]9ld h9RxOh|+Ag|:[ ڀDMbӶ6_yt vZdv4M#a]"B1 "aALƜr)MF4@zXloYڴ}8K_V]hCP&ǵ4(A>1C>ĘcPזk}KND!u"@pL9T3 ߻xe?ۇ?=F=(= '/?b]H///K=Hi|~ׄ{| ~IM#}z2Օ Z %"Ė#!a J8gSS΍;`fqtu3h5^E`Z[!ֶG+HP5YF`HK]Gb.-c3JIM&xz)N$ caURsySkb/5r%:ՔHJV;W_",nE姒`$Y3WR/H6p5JA/+L$#u鞛RsjNNꯜ_8Q7(/ -B*J͂3Bdp;2 1PcXJlXZ9 "6H، <04GLBdڿ^itE pَ1J㝨K&1DfhyO'O|AC I|$t_U]ip$qdp( <t-^:Bm{4͵,#<4i PT~悳!Y$@!kSsDkSE}L$޶VgKܪ 3v>FD>qT-bo-\"2@zš3@YkZYe緹b X;dŬ0% ;I@*R!x>a$F(f;Pa=ʂfA+ )׀C8}K.JcLB8CA(Fv׀@4z~)ne Jvt`@RSNo}ݨq00JJFaγ-/p+5Ep+YUKy($+"f oiQHXA΂0SJh-NtElfT0/eFUxKy S"[tV qfaga皚,QWa^MlsFBpf4EƓЌZjK^U*Hhszсgb)Pe}hY, gwؤdbӻ!'^Φ'$2un"Xsy{$ߘ⎚ ߅w0@]צh_iv8u>oT.vḾb܍<ƕjeJjp.)5vv%rN9&G=xSYQ%_vq_ H,QʼntAJQ Sj1)" EbChTڱ>^`p rPCZZjr|J9PJrXXHB[g 3N=$y牊 a-c*@ xTi(0ĉO8G5V>|Ob9s08O90206\!Bœ; þG%@Ku J\TvROe?z!*/xVR 1m&bWLwXDdBlVͷqw󅔒A^ǡK I|ͥGH.*S qe^5RlC0 `& zp QpEƭ ^&!BJA꠳HTR Rʜc!TV64<_K8Um0,]IUL-5r@!B"1)aB;DLJ+ԙZlo*@7;Ym<޼"YUr~zdv嶞: [ZN3Ƣ>X!S[1Al[mEܝYy‚HS%\:G&x6u7/2rZQou>,ɰ%h(_-ӌЧϤDvWD2A`mt˩R*.Je[ R?!ݕvL:]5nJPsN<Ă s4C `G:#L2#ɶ>ܼ͑ܭdTb!'IEIzt[!?'O|{RPL혖 
ӷᴹ}6 XK 88 i9pco-20B".b#f05\xM%z$Mں`F:8xE ,/i;Fl(i/m(D%Xܓ\ql7PB@IKᤋṲ%a\ZfiT4h1;At`8wC5 R?R<>HWw薆 K6>e0{P=ϳO^~=U4 ĉ4j4~q8w77v>_^ .խSo;R< V?q%틠cً繛M|'we@&d9B .ϼx fA32|waYI+E ʔ0sonj%Qgaf 3MPs4I*†CkkrPmv8;e0Pk(YmM C0ylg5pZ8[vr\`,C.cWJ٬S"Ă!g(FxOxR0 L` B%,,\vӑpF\Bd)Wr *$^;O—nK='ܔ?v`x&HV.9y˘gzٍ_ vx"9$}]`<>g$s ߗ-ZZbxppb}bU.A%XNX* h:+m5\y)gW`;~{XB ['pj&:rƻI\ˎv뷆 q҇?HAw]+$!C9kq3ͣ<_굛\[=.mw+5_w `? @ٿtBNeΏi[`n|ZxU*Ӫ6gXWeBbwAv,k!35PmNJL9 Ա@?vS(;C^ꡨM=O?s3WEs?m'؜ }-W۲~cV"i Ddϯi_ڴ/m6mK7m !)J΁e)UA VZQ4N$P6nG2VP&g`.86؀!8ܱI9,(Zc̟14&4 fJ0 z w6_#p[ᶘ13v.[=ق AM cAPD YCzqBr?p].FaTHf(EYbTL6b rrp\1=Ӆh:Gp>#!R84#Cp:돜F߹I{qL"H!Rdk=VFI_DE N#\ ɾ)u""b,UeIh'-ٷ{xޮOqvƻXV! H(_*?d ! yvv\i{l 7wGeGLA!+B c~]Wز=~}U9RC{rؒNCg.nنvڹ[" )EuVC^,dEhaUg=k};bm{]; l|` @t\-'R(Ϗ 6(!6"-bKAщ; (E@uq݉~ul^hTt˻~JH`") N(&"^OU2;cG!R>Zz 5El9=O؅ɾ=RJΡ B)zIZ|!Sr>ɈD/pQ!"5[9\Dނ7s^aKMΫ!5ZN},)^TTdSَddcEɺ_=Y^Hmd<=o#>_$bK{G O^Pl"[C6u&@Bקּw]<{l. /yD a/qKxŕ~]ۇT`YT3Rqʽ U+ (PzMijג!y|zh^Kh Tb9T%/TӽMjNX9[Fr Z[dTj{Гw."3Ej+\=D"ٰgKhWLjK 9;w#bR)tTe/5{kzwh[2GV&YYh334$ 5/+rYK.%ʸkϻx1 Z\zq]_~6p$z¶a\=I庯WPҫVs(6z>y>nf-oTϔ&7"֖~<[/oDYЈۻ# 3B.w ?oogJR%QܞQ=>3寵J?f@ȷ՟ˇڳgk H0^|ʸFΛnvg݆>E7.?{%CY!ȹ0!9OGC-{Rx/ZZV)(sy,HIitc7Sv%fFr=!˂OA,RX\8S^`hdJWRGx8I$񥔇-T4R~!#HL9۽H9|{=wNA>HĞNmZHAp΁,6tgdHtP'윂Q)'ohDJ1HP)!6 X0H%ĘU᥄ԔNvL6qҔ0!g_eF&8yC3ڭf 33?6bfjvFH5#rT*?Gdx$!:Q$#PgXd!KUӾ*:CDKx;OK6%Cva%AK5ܨJRd5BVi]u^#+^ޝ:Y^:; .[_ϯk2C1Ԟ {֖ii.J; @Jb~j+YLI18̓NxtG21L? )[sSuzy)űCɖہŜ 2rUFKt&qy{[rp.nA*j\2G"-I)h]9GEX$XzhoWWސ˚d WeBWZWM]kJ/! УN`yTGʴԨs5Rz#VjCSė&FHv4![Ȁk^+GPsk blY`5"K! 
#N7&dg2CPEЉ tI);=I)޵+",["7 $6؜&hXl@H$gHɶ8?bBX")j8Cr%[~hLT[sXb]ϡٺ{?zڧԩЪ&\sz׽ fl:oxL5Du}(#.߲S?oLW6LY϶=0?46+^\bj;viV;lB["ǽ$'gYC,2j{6viQϋs.pI #0 aSl*S;g2B)`aPgk\ȵWD͊&hF;"*ŦQ1 LcRUFyq:I4TQQsݳL03V~i;<̴W~}o[ci9+k + 6̍l7ͮnV1:ݐޑ0ޒ))mDԻhC sf udĈ10F}cQ[3mGJX8P펬>W n۔UGx1cM58 ^aZߥ(u6r7ٱkGI-s ?_5e`Ldj~Ԣ^&fQRS2tG0TSHFet/px?lFdйiiMiPv(8 ?4:6]#)_h_o%L5Y4Lr'%ߩrob1b@O<IJDꀙ`DkՑ̽81sk.}/a ($؄hEiVaY/SȽF ۲Di30cc"?&h^cT>xjjj_83 UR20+1RjԐ0Ë;Resuҝ#oj1Y`-3䂛 o~hE6w46B1LuM">oGa| xӑ)y'_ٝ:5ī/䦝{y% IonM}~QWwn{rruIq^| 8MUX2hl./⨶E^X | (ow(O"pޙI̡P{ЈuW8 ?γ2}AywwXYW8v>N*]0"){MD8ZD)b**y%B"tq~\$ݨT`9ng<Tf>K}yڏ#~9 T IUjAu^$׋(`&{U}9-B;w'5It ~΃Qod I {ݻ~=KȄdI3ϫq2~딻۪7I8P:z~KZ&Z*PN`?rڷY(F>\Pܟ]\P>LM0GV. ٺA ԉtc"tbmQJ]ֹ-QAX>MoNVٛ,P,gh@"If 0zmQxfZz pWূuY_PsCCY3 ҕgaEVJD|ޕPEKa32w0VZƨzk^#/?OUk^#)X'6uJ3n6fua;#&0 NMV:iNP{r٬ee"f\? z?0_->zSvic-YՅmd7(~~Ԍ]:TO0cmĎ+Vh;h$v{U$bA+OmwTlW4vA'x/㗏JxxҶ좸N/a_œW\LKv1Im rхC:.y l0;pӴ-$Ch^ >)'z;cQ `FΛk4+Ӌ1kl%c$cUq12,T3)76GFc E>x >N'֋|ݲilu6i z35Eqmu~ŽIwp<\J[y$K}Hr"Zwu9;[agXGGt,valf 052 #-._yxk~/gM1NDdOZ`=itȿk*aD~" p TGbtA{cdd ѰxDj=].5[~8 PQYre*/_S]D%͚Y!|a!ơ#l!D凄B,$p4!!UBkOBJ?WY^pd[[,ZYB6u ɯA9HD9U${mLacY5`B ,q c-b#/AzNI PcP}o`hW K$Mo BYLjD!jk^ 5LڣJ45"+ۭ~krP'Ϲ~|W% tqtu/n"0 en}R|K3*\ءYY08| }Ν%'~mBrҰ4:)lDUOLks= iQPblgfQ,e(H]_S+i)h{UT{'LNB -o/A?tN,Ф 3o< e$B V+SK;f}TWuI Kw_GiFQuQZ?c R ǟd:hn*}A"> C7W?no*GQ7N9>>=rnƱFG(ϕJ} 6<[kRooˋEO[@_u]V$l4 -Iᓂk8'\O}'~?y0x2{rͲz"}yv~O{6?Ջ8XϋgzK 襁8{o">WbAv5kfg-F$ߜ|)TQg=2<t[!7J}92FD}W%Ysu) 'ŧ[v\WIYݧ8h8e>\Xk3ߤ_lU yJθ=ԛ>V+/>[#>g%Of3Ml_}_F3|W~hxs{flxW=$o򫵔DzT͖wY=~ z! 
o,ts9VZPf"y +PwW痃9J%HPatRjT 2Z],iW:XgUVLjȬF+[@>e ₲W?nϜݼ]ֻ4?"Q Duъ b r[t]KM~mIV33ݴ!bJpE 圕 ?%SIIc1A16oAy&:y.i]Ix&yiv}`*'~c*E)V]\4m9J.f GxJC4Jq* L7lT10iZѳj&ͪ4uxUW,F 9d̋MXVzuq \Jl&s8F%g5gY051k͎J7h&O}$&O}a!?%~f ּ֕l8=KWa%u6IjdMGY@Bf%Z4N1O`4wʘk%0t%0t%0t%0l*1pN~>R8T(KrDZebͣoYجs>CV15/x%-s􌎋 0\mv"s74}@khYFӛ8ؠ=;s T &Fj.@[9bNluQmKMh xA=>48CO =3lqƠ6515:& #vzbc艍'6Ħ GmS ()ec@RDJUL:p!)S/VPU,i`M5 ֱLBzWfx+q]J9ܽ%7UGDuU0 @3ls(e81* 艆'p2AԪRt&"&2br8_1zv{|=os)Cϥ =2\ʰK% 'f &ݖX]Wg;p5UwogWo_-n g[ƛW^?~t?zT>yi}Ώd[sGP'NV30&%*jf7a2-6P s |&qC!m̴;kNdJa}LAՆ01 ?jmMn{ZHdn#<ro.ԛW%P]sqMO*4$`pйljrڄ=T`@#"a_- J!^*--l8C ?xdY87R*aBukV1^`6H  26e+֩*'o f²9Yiw~ V܁~M__+-E7XT/7_]\M sϽCQp?<}L5!X+OcK㰳4'qg>KA[&jr%NEZxݒ1ԜЉ-Fn_>7{'aυ<љ3^Y;{?5G,|ƉbH~l\Q:nڻ\9o\,S&VTx=Ëw6[|g0V/~ wc_@IZ0KbRII!.drPS]FJ2͛$܏Ga@9M2|.P+-KK%r%6،U\@%眾y3/[N_+ٿM?gTό~ oTvuD7d1: ٛ翇qsmx}k~&̽3 G?zvܯF#=Gdw;̼e7oz ﮛ^hL[)veEEy .vSхe_x"("7J hh:eh=K~'q/-,Yջ8ZVL5;_ <`c`>bqlD@acA?osW7[xw4C:X85XjPbfRr\ )6nPT]-+"eN{(hW5leC{RM m⿰#QajL7sjRLf9dd87EvApT NMϮZBSR MjK[6k+T:Rpc*lAZLv7{Qw !k(P5}eWbgȆ_SYٗ&9W%Ϻ*q:;8'4ې/hB1L߄6/CI)jBB:.8#L͓X`2z ☵fwYd釞:"Y q Ʌ'83y_RԺ@w֕Q>(Vb䕊0ʤ/!fEwOmӝ!" 9`>nav:+R˷x36jn#Fr %[+Nd kwX(ߣX˄i|Y`\꾓C|CJIvG< !/abLChI>0oGw|& vəA08+촗2^ d-k!/'L`Sni*F>?eཁ;/haP>L$~,OX{ A )(av LF}F(𱽔#p(coySqdPx1={8dkq{,! 5]ٿkšGk;#swXmmGs̆;DAЃ(/4;o*mLjb4} oHfCO?Ү$t'^0B`Ȃ9;0 8 [pwTx9,J>Jow̱DaxC^yaPʺp@vGKOڅ<"XpIjHCyxﮛ6X<$3J1~aޘ7m#D%I&!KT)::GjQILTdj)F H>upF1>R *tBPB25gBw6YMK:ƨK>  qrk+R{G(d*faCUheX7'>['!hźeK'Ǽ|i*_KUc@Y,)墳kʃAB*l(\'?= iB)Tbuū. b+ymjA9{C텭Xg,{S;&iMl2b6 fгC&1!iUR/-h;IUWjTS7{Oܶ_Co1df!/q{;7m2Ywp-6PǓHIؔHVEΎs8YPBLWSs`EC) *i@MGf1eiQD(F.(% P+1F‡4R0lC LVsN(N"kdH-{qހs-‡%  0OU%h#W K1$LGQZaXF0RJ2! 
jOJ(3`ZJM(\ } T.2̄y:Z VU[ `3"kdwHq%w"}ȗ>F^s Y)>5ZKH{栜=f9`p R#K`04V(k6FPO1| <J2`-fjL/^B^ 4VtӴ 7~ ]c.LA ɠxB Vdu-9* &dm(V:PH*^plVr@Ŕ92TabxO2l#ͨ;}"(hQa6863 ,4pZ3g0m\`,a&|&xM.%4 +; VBlg:ǸѴjah;Y]pܝ,݀ A(B- 7%{ӥO1ic&RRހ4,bbatƊmU*춽E%Q@(P1EVDiht-p Je=)JnZ[b30)*xDE!P'PBUazG)\PTX(ySHmƶnf`V ]ėS+hT˼q7:A7H|QiR Fwxgrf}Aw"܆0kfM, 4Su U?bW'X<4M3Ϭkw N^۵m~nn"&hw&q'o{0||/ `Maf7!ɢ 5$MNFxD7Ǵ6Y?؄LcR f7"\ԝ=8[CfrDIjPD2OaWKG>l&R 7(uVM`dDX\6ԝ/^SHɺ*UOD3I3u t$ʸ]W F8ۥD-¾=LnRpPݔ\dbȏB0^o(1\;wէ\]8E}W3I;u.|{(f'lY)ܺzaD8 N%7?QZ=6}x 6~r֓9щy% a`Pp\LI msBWGZpj&z8tէ ]裪Bf\EЗXy&4|C(, LxY勋8IB.aM`3؂&[+%5j)%6&$4ZiSt{:˵p!!q_큤[mM޾[0n'#d#ZM%O$/d+3К1D#QM2LBv(}p¸=(rϻd+v*51^3dDZ~nbO*}xxȁRq{ u7Vk@]ֱJ1\MkPJ>=<`^CbzcwiKLv&ƿ꾫s=.e|1Q.*fvp|ΒԐܘB-!+{TC pcTIGs#qHm@vg2ywҡetWM!{lY%ܪB-ѹpnl ;Aމm͹M5 ?N ;-:!^# kQiI̮}ǁMzEG8)}(czA2?ڭd5Mb&q0[KW !56WJ!;i8,Lܾ.[NNnk9nTkwzAkrҿ(3/%Gl\%uhey\K > Te;/6֚u0O|oM~QJ5's{_ONwoq?N/3ilNx4rK¯;NI>!\n0M5g|-K<=6z10Az>~ &`$v"gŶR&75q ܞ{3 E zs$5%n;il*38ʰݭG|o凣ٸ(3_\  ôsF\/-eKmNƁϱsٲ~MQ:`1=q&4G!F0)};z]3|۷~^Oo2 {r NY7_k %dTdquCd3)e˗x/PoldBtnTRHr"뼡9u[uz=rtq0i DRY5:W/sKә#3-N9x}W%岵+k&@_#oVp2\2̶`[WK;kJa7bubFTVژZ ߟPL(.NPI1$8g"YVnw= ߁B(A{S .NvIMo$-E1votgJQp;%5+,G lha2qep ]()R@"?q H(qŃA5p:"?i.M!Ƥ$g8IGh۷Wff#1iё󠺌4֎9uI`T#P$Qs, O&LڈF~a|5!H!Zߐr>ք3ri#I#O*OE,wXit(޿g`7G7f}08 hZwm({ѥ,e"~qaNPLߖos4^krҹ* *WW0Kq<^PDkA)N8.'2 %T%A PI;x✢ER]C=Q fq:z%ߔ5źzFH9 z[eFTzZ 0 D/ B} z&yRPS^(HhO "h"MAh)v@0A5?KM%0($"EXհ|5,r ˧\^Rˆ>D ٞ$q\L{q,qr (e`YވAWat}Q*//KS,yDE)Z]t|(Y:bvއ{y8?*_+2XAGh@5m"Rԙ`s$qj$6%>)%\-;-h?]Ƕȕһ`k#,yE H,PF<_5KɉX[QBWZTـ,/bHkauT༴Q`@VHQ* Rg]"&Rg>m bZծ&ǣ~#|?ǫVx~u~{dgU7< ?+D\Dߋ'% "z}1F;  ߐNhBwu b^(nZΩX!ir TgG$4?HO.J0HRoljKOgHJ.S7K9/b@Oe<%Rt>U:cTXF1iIz8Yt(Ԝ-O f3Oϫׯ^u>̧vPK1=\t/ec]9!H*ЕF1Yh(0@rf!hZ`+4KƳ}Ц qPqG ;o:-&s.Oɖ a@&1 B~GY` 2'DEk!s" dé5o'sݦA9S dNbX5[rW94yRyr>7 EDʓ`i5:!Nu4YB7_sBqМF 6ܓF6 5찦JJIzBa1j;,5BE،e2+XA=q#T?*a:TqvnJA&c)W ]L;eE>)rAWN9|@eԄX).A$XBvyYWs¿¢ :/{0B5uzyPs u c˥}J L>uC!,]6idMg;Lm^>ߙ*5.8j&]Qqn-6) hF`XԚÄΌBlBХ,Bt8o8 ȶB6 u<`֏_O$ RY{T$Q9[kWhČjZ= \^w-MGCmF9;sL&ыR;.FObqŨj;;svyH h.:fe旀[,?*o\+xd 8[)4;_ڕ??Ύw';Y؛1ɭHuNn5-4ɔ@K  
jw'{ioe|Z45vۋhs7t7gCdٱ덝9}N_]WWʶ8kIbݏ+JEo 7Tj6cM: hi_g NJX(%c[&= r``B0 =HP 9"TBV'}KVK>Tj:czsun;qȐG=VyWl e8y2)KZF>Ef0/+G5^1GO/9Zhå]1>G*-FIoJӝ[};'{iFSځs@&q:ڄi<;Z-|IrAdӇ{}èޠg~i PKǨ#蚣V2h9a >v1<|'Y2_ hB'2TYkJxj2Jt)amq t\4ӁjywH"CHEB ;bb NE\DHtADH$7x&!i(E{i%8 $B )<^g3* IyC#:Oo;6ܟfiޑ[R4-Z]g6RV",y^2\>4&PJrK12AS1U@j,$"M.#- ^O%]b N!2@dA+F$N.QNbq@h-CjTі+ hB% ]Vܥ׵ 6J?O;XB`U Sjh~ sf@Opеh_j˟n̜*PٵhtjI)d1"Mcޔ0XYemQLK:2dgJ`'AѓIAWB$'):2j-]t";ƢToʄoA 8GڹX($ ]%ۍXF/QV=r, I)Mrw{ݧ7](_Xh .f52KVl6cX~^D~ uû4~1bdu{{=qnvLuZwt7_7 t7_s Ձ]xj#`?Y~v%+|]hj Hk\n<0c>ggR0;9# f4'~IzQʻIr=r/˻/Kفlo-zDӇw?oPA."*8=O_clXY&{|TW}W)*>06T0U*7JJJkҏʐ!_R=r} R)G~{#0VPg9Ex3Bq:_: #C"xF?lEΌ軧Uvc=JFӪaq~ɗp<6znkߜo: FdA e !n7gmmlpT q\ f|x[7xS3ߴ3Gn- C ]zл8 sJd v-Z1~:)$L0/Zprf Q-]aJ*.W^2({|o+|5{~A1C,(1uOgr%^w0䒛;B'CZ,_,z&41푁8&ɶFgcɞyo*LFOm))(%…"10X$818,vXe:!| b mToH;9|v :ME"mBV )r!{yG߲:ӏe OmnUh:XyqHUڨP9P!lqe}%i&n՟>~|:YO A8.JOoObM $~|ÏhAw`!&z;ݮjKpdwp EB:PT]|6Z]\z>aZ!;];VԞDUmnF_6Ieh4Ƌ*NvrUp%)`lv-Q޻O!#b83LӍx:pS,*5V6 |w_/;ϳ2q8x~K2!H)ƣ lt]_g61W1ĀUPqibd.EEn(cQ@ c\t>½E\IMwxO\]9Lfv=S*BħC.kO3CR,(QܝIܡۺ4o\-N!Xr\!6X{<!A 'bۆw;Qb9Sl'C—Qm`@+*}GSio)RY|ktTR5!5!zLI&9ӾvdjshƽDH0E X\{P?6GvjlmEBԹgjUZ&Q ,31LdZD0Vj}}{^%[SXFƘ[%qFK~YF12́/g|3ݻ|oA `ycPOb]Y҆ Lw$W/fJ})0z+e(La,g)9a\P+u"YBq~EфZnZ'2^5n*}%[UD5qe>_ Y ++)A r>ڸ}ݴrFZw֫mݲ?0FLwKlMp{Y_^P\_Q$-7NfkW|ޓ$reCdr~:~tɳ"]  1^K56BdMPgE-Mxp,oW"_D"C|̗7v 6s\R>Go_fny~YuF,xxͭ?࿯%w{htJ MLTz>KƽyMDRums2v˵f>Z_= =^d8qk^=N-|r]ު @mݾ)&2O}y\u3Nr=][rjN@2лW)`Y^ ..-c2rZ{珯sG$:1%U~t5? &uk벣kmO<#腣GzV`p&ϛ&@ّE'eNfx=(l 7x +<'sN{(W!WZ[݆N>> y./Z^ڀcڃ{:asqIDmQ yrفV5}̀yV%Zn=h ~l lB0ـi{&svi%#*:Q CHE,uJҸS@b.qE̝M62{ 35+!KF֌y(>6<fiFhB{twT#oZ6εN`cfĻLN9Za61:\Kiztצ64sMI. u9MT呧>fg ٗiwVj%aw2ހ0|*`o{wM1C ;ᇥ1ҢЉd- 56R%edvɂ2hǚ<|py2RBErpY(uVVŘbduM8(d^q'Js V 2.TFmJ>\{ky$"z!*4ȫ$Ld -ނpqQI3zg O(og )zykf)

k"w?Oq1s77ޓ|۪TE9ֻlst҇nOC)(||PFO10mzA$p?7w࿆4mPSzZ`+}U}wKL̓Bܐdoݩ󟖄tg课y |W4ʎGT=I^%]FwZIyDB%J$uehF5sUGl.yNF# Rx>wW = !G/ro,u-bf 7<8o,8MYZ m%B^0 p"-)?T$'4d[hhj5xL -Sh"2ѺM=VAvڞm_X^ ߌv^O (3aAq?kdxƽJPV-~7 GлЧs*"XOa&4}/b]Wy]1QuSFz/=ر͸qӓpݔԜ"R`-\|!z]~ m.Wr^Mm;PnOn+X襃|8lеڐL6*&'=̥tr\фU֮hB. m0Sh?-99롱S{'яiGchoN.~@7yeG0[lԬʂF2H҅kf%m[Բ2 1){ +۵y=u-xlj OCLñ}ٰ 3 NO"pyFJj4TN8)Ɨyw>2'WsdO@%-oJb:$Lr{Y$%lgU/5tV=:.J9,adt}=Sdjhޏh1K.Wb{Bd-\Q! eH+s| #xU[-__KOdߋ~_.zDaYK ~bjٶ#HI QȂRm *k K,rR dE l4,8axA/~/7 "tgxH%Fhqg;gknoD`礉 bswmCwveIlGώ' Daߑ. ~ Or&ve0tpOBрnuOؒJKaZ!:-4lRHW ~/kUrhRZu'YT4R} kَ2݋j`+m/!t6u_4֌OMI0& t%pbO:%"^k+eoM2lunW[!gloCˋ~|YD [tYsf֜5mfydY݁ԇJ$c!1δK BQI%SPHMBYw)Nil5YU7fCt6i_8LR^ܒgq4׋3gx-BL&{ ƝrAm4[n^Y~?ҐWb}f}i{U&P?_5$4d wxe+4Ey˕oWYkЇŷVoͬm>'KqE!jzvq3{W>_@QpZ%QZ\@`$ -9hǫ|@udnGO"Q:ܩh WuP+%5Ȅ ̓ kAy7]Z/}pسZaL_wx@mɅ_ܜjA(QU5 e>|:y,Gp":2R$8:4#wu @)tW:7}5TdJgc{TrOя0jK_*Ѡ4%%{#]MҘ,YQAPJqP ({M,K5KNINf)c]{ɝٻ޶$W|r".@dfͦYH3+JMɔd7)*`fb穮*^ <BX(,Q ,wݺmBۄ@8 AJaI`)ŠBZ8 ꕃ_*BI.s'M 8iO\X+Bq'02FQ,x0 ))4LpF1Qȅ$b }!2VH,aWth Z, k qUrkrI1o~.Ms>b,vwkKGz'WX|_Hs-ޑwo{ZǞoOï~_{h Ϊ+?~ F册 x_oqgЫ?w&fƋ }gt ϗd Ô>=~GL4cD%VKL?zSG,?9s@(LIJK?$BV8(┉x us$Dl8m+Rz)(ER(OCZ2_V-!M-ZBA%DTEP(9ՊlopkEz1RՊ $THw)$ e0_|d$c0^v @t`HyD3_CHt $S*_e\,5c11:c8h3;B sLw?>:tCv!_X뺞]a+ w\3=gF9~6`+LnW6*~dDOK3)m?٪zXa!uk3SI WR|k>-}x'x#X/@:.\9H;uUIT{1,{qDU,Z;ԕxR0+tUH]r 7E wtHQ^;ԕ5vD*5l{1Y c c%ѽ`ѽ4M׷%x~DjXVQWj"KR羘xYZBlV=J勍ʺ˻p K Y`a>Oς41߷HQ] )teru#b00Ŝ/>>ޙ),­yx\/K,IW2V)-JA-!'7r|R94]._g8Q6nb 2/F?a6nʇy(՞B%W_jn 2S<]7濌߳KJ^vUޢ6 !hu@ߍ! 
~/V;BZ/6OOc3yx5vnf5Z ծ2W̍8ǁ ©V8ѩKkv rO,/қǕzi.ktCda{n6g~>-_Qrfc 8({ [Nm=vӻEinWo)U鴷KXJHx :~Kt$AI{n/ +qsPeBe` 6 Ɯ-c8ҰN0bbeI ^J]Xk($/0pOn&a$xobSW,~KD9U5 헼Rh~7OFROnKkA.tZR{^svǃ#8ᴯs"0!Fw˵ #5bz0׳@~?9?GƂG t:-EA9Sp{A@{YbK1Afx_ͯz>·DLVN:w.}#) ״J0PCb³iiƒ5<H~  3Lh[UxjӢL-ОЮ%^!d8i,!iwA+3-8KYz0TQ5!Jg"mP<(9 .ޑ&Ch;^>Lh:+pfc1nu P>W4e4F;GCIk;>pC33fx AN_n NUL*%SPTW׫WsF+uٷ Gn*Ikn˞`F %*nL7}3Bُº,<]Wwx" \ ^j\ޙN qłH:8G]:+bp:K2=Ę a2ˑ04Dq' {m`R:bf%)\nG0I!/3͘/ *T]z B}y3):b!bCkO:IZ޷1xCBp{$V1+A*֟_p+=\"f^`$p'ɷrL:h#F7~?tvNj}={ R!~lW3z lw:CHeǫ0V$Ҥyy)O?@)VHoypHJr diM<Qi!`+-4ʓG`mo<厇uA7$a%F)U}29[$8c|x0Φ)Fp =eHv/ W BPȹƚw2'^d5nʠ7]ࡸb</^vJlxf(JADT A|4}ײyz6V Ğ{z;zIBE.v!G"=nc'Dn6ZcXf͌:J1)B I D>V1Z`eEI+)s-r)vFE&J^}- -Xy /BߕZ^£/f6߻?%hx,-žXL?T|(ro2}eee՜/@50h<-Zc)cTq @+0LV^:/RQ W߯Fjͷs6jiUAa|WrnBqxkDC.D"93R~z`b9ts]zrbM&3Unƀ5 b0wc08l1˫ >z7^$~;q5",c':sFAS0VG`~;4P3!)a;βVAjd SqVs@;ڐ ;.8]B_ߔ0Ы4#XU0]9xU*ihnur7Ͻ[eYv&۔hS5洬yQ9PpQZo0:5L(zo";ok?zwwβ~9￿ݫ χ/8I'QuN_@$l5D{0[V%ֽd)"`Q,R 73Yd(8cR*1q&xcPZS!5eXMS:h9% G&3trxqun:շFFӒYZ[^פn-mST짢T%́[+Tp@#:I,&P|d"J1QH ȁnU )ts|[[as ՔVFI]pKX!Hp)rdIQ'J !R0 (|DDdX h As"S DQ˹ ,AJ'uFODNB Œ` 8q9*a0ӵA* &#d"#rQnG7^ ӹXyi3FtFI=~F^Nn">k<,再?Edos4Ox|͹O~^\zՂKox~&pm.De'B>\Q.by7㪛 JsҨ4Nrݦجk)hws{#8\g:U0VT <[py9DC)j5QfE4˳_6?e=h\JfjLC$=$Q mT`۾:Mp;,~Eg> ܾ++R A{7D.\e!~@Mwo#[ ;Z {ɫI)}lEc DI) /u< zCGRY}Ҥl/!h_Iї2*L܃24J<UDsiZ.يRg3l-L_ 㪥.ԡ] FԖԧ͟_Oh m0~-u@vKl9݄V8ѤX/_j`bMj.9o^KwIR}(N-!fǔo8z=C DԪO XRl;ntq0Vf3&;z[ėVSp+d* f= kJ13*,%Ӟ&רڤ8qfBwa^*gd=P5o F`].2M/C۸'rXh)bM9z0ocwh>Pہ|׮&8|ش>n澄][\1a%L:b˶fDDf@VZjkRv03fum0ӛmY d|[acKsdV0Ӫs=fj]gP1E erD^m^]*ZΪ+Tn;lPCLXuB4 n磨!J†!Ҍ5 QJqo|O+MfyM놚彙PUޯ(=I,ΫMU 9+f{r7vt|>tNj,YowJo L! qJd.J͂ V8~~w5A IT{'a94_ m,] U8i?S1X::"sCΤ&EDV& `7;,56vT@6Fmc#!щDqpG5S4TshɫVH̝81^W3R-kޞhU;('P87VU~4kшMk&U- 퐛"Ԡ3wiWSbLsi8d]D h4t=>yO:(J8c 4QF\n~-ugaj lrvzf*#k0XRPlz[R*- /%M.o~VOUY +ʦ5D+U~nBY%d< 1v^-/FΔm 4x͛YAE+./?Q!545A3َX;$\2n2wb\@QН;ofܼb=(S(tU !٢2Ry0.,ʍaO7S zǻIL8&˨D2`'ZL|#NEhxU0M#7N*QF"y!<Ĥ8&$GЌk)Xq<3OzgNք;X:vAqGsp2Ezq4T ?LfM>C? 
'zP~K ܀F8IԖǤ-uI F:p(HuTy8i U$aV%u^]>,$K m^ I~4KiM?Ge6>Q7ʭ M ,-pJ2G)AGMoMO'Lt]M'W7o^d*.&B{l {?ő~K/~',pg!́_- Nめ=sz_}ل͸{g$msTמD.tpq크;o_9e_9eP|)DmQy5p 'fGk)zBɣ5MV֝uTKw}2̄bǫƍsF*&g_r)EKmddׅP=Jڤjܠ>$aѮu JnP)ax|JVʜ>[qy']p\ۀ.nDS)=AVj{5xPu(7ˀD4a" S\@7 * sɒK ^gҎzA-&NܶW: #Q+Ti5Yv4k2 FM%nSL {%RVG+' 6sY$w[+?{[X2Eq:ȣEwV4 $}j[ gPEP2 m87$V h:PDq`뀁)bWЄ k ɱ#֖q_40CNđ&\(MU/e}HU4kB|+kxFD!rj&`e@*UAҬdր@DJSJC:r"?cb-r Jq~xT2B\.!x`q  H:#G۱sw ruL9!4ou-߫1`dTHvNuog -#PXjocCA\AR13nrQIpXj.&AH Ad̸[bz (Jf}/*,u梺'\wnwI\X9' 9)5a 3srcC{sRz !gHD:DPLXKR7"I+uv!)PK$Ip,z5]C˧.vv )DN9Z7vJ^kE UJhrTCh6Ε;)7ߎH K&RSUZn68n?88W]n /T&h(X~GNE7о4ѻzhޝ5$\_J{55-P[jI&T֡A;ٲ֝Oat)J1۵N}]k~;WZK"cLc% Xk,ŁҡRJ")3! HQ 4uRI.k.ؤ{RvSY,2$kSW8krwph)Z9֝g׏BlZ2{ P08Э}Zy׳EH0Ҥ}ˊN_дs=_0^)#q`,;$cpk=u<,FSޥ8:Goys4k!P0$cxy ^ߛ<~a|2c;{#gµ~W`h f]?DE*EVB?֨榯*΅!UZu ?Wf.84#'G2X.e}]e?{y %d翶0V9Vqk5ؓUOI,gI=NGHqluj/;mA@zl4 "(y](]FDI#cu+KXK[@JP ? @#X/Q+IMߠZ F%sEk֐v)w6#IJ ]ۅ㚽 N'5SrOtm?9;ʐbfJu]D]mh2C;F)5w3Gtv3z4 9G3ȗlS w9ׅ\twIi.pGk}ܯy]uwKrm>֐<#_ /˹;FS2Cn-~^*{7}R~ !^q7=4 !] d%J@wYXXM)B3 w\*BH 0 +VYYbW~Ji>@yJ>Gmƽf_z>@m@|u*xMq7G֝98G&\wꛘZw(8*;V? hֺs/xਐ uWWӛYΝatQͩyشdY9p4mN8[\tStz*8X0]ay@H* UA\0qc*J kj$q(@ F憇J)EB(o]ZN0oWkSWM)C~mTf !RـsFaR "f<(g\eRJ|o]t7;+,Th=K\M!XX.%6!YD\em N~ 7{ ȵhPkn^CQ^z<_vXSc›OlB4D0DJ>PLpb=M\wǦ$),ë}| s1{Ūa?X5$^n'ܑ̘)Ф7/΋+3SB Qüg䊤yscrXo?]'0דOWʼqra"6FC '0lOnWk'Yunnl~鉉tkϷ_Ggz pmk3i5bqiC$?{ 9$OO,\1 '׋d2 $Op:E*v%s㗳d xϧ7)I$ŋy2@vNq==凟~p|W?~z3>~eutx=U?b|5^~bpwMO'f|]~iT_SjV ]m;ХΧicW.Ó?#Qt:H['L5|J́MJ%ȼGtyNVo&N_Ġj7[̦Q=!_dap&@Sj<Hr}>p)67D= ?hleqrt5!02y`O]ݏ^c~'ٻ76OӞ~fBGdylYa!{9ϧ?ſ`vanTe nk냯Ox9mg(π?~/5/ 륍2 pҞxuӳ[@{F̦ğ~2d '鍠&٧M/o!Otp0L~x>-AM.IUf\O/0[,?Xy6%WL6{}n͛W0*>50T.uHJܳ+::{uz 9<5a\K Yyh;hL]k>7a/6b TvȔ xh,EFwsH W ܔ'IpҺ |E>4^~B]滣{.v3s'YyFDbq@v/8sc{)=Ќ.#$<0߶ҹ ~p{9֝|2*s'Uhz '[+O .zm~nm~ a2 1!$C8 id/{:}liYl>l%[8ߢFj!9Cz8QW}[JvO +5xJuWEi߳<)~tMEpQ..@E_{aP;/KuIf̥gnNe $DԕQ";e] +m|҆NRR?dcDx3~\C,W Xl.v3B75~]^;!k3*ܾ+ZG¨J,T 529y,K2yzJIƒ,&dU AH:κ ]_z0!X)T||*f FPȆf#/ϿV}x;%%? 
}n5S1F t ` $ѧ+}(9"l]H9cB%鏇SX8Q<س+'ɮQD hAau NZd'-IKڦ\y =Q+Ұ†:椃RimwBIZVסžVĹG/{s?̓LhP;.B:9T+R+r!3Wl~ sP.⛺T7#9KYGw\!f\fh( y+1*w%]Bf+_K)s`18F9[1ڜWh<pG@ $2hCj*=LÅ]L^n5/^1RlfIIV>J ^/?Ί3 "9ᮕ!J{q!#>d`]Os lK@ՙq.;{ajmb6I|8`SZ{ ͘tkw nr]3_v_`Qw īsa-m$G!q2Jr1c<8s)PZvg +TW1e9!@`/ ܶf2\W{qA-qCHr!uY"@R5Q%7m?ݧ?fߍ"5Sx2^~Yz]lfHup@ceZdaURCe\ݰlX R\bXJauX2< ͧH @j t2v#‘躦@qJgKU \)3Fb r֚KӾm-˯b=j4jB޼dެ'xSW7D0DbJ"ifCDLo %VBC<6t()6Zœ3\@)=CD6Z_?ce<:~RCngي IЙB]j=;gCp*afa3`q@aNZmP횛 wHW qw&GH幀FjФ-!wVk(2cV:l"{pYy6oy-yGo!GIxk#ޚ:Zo</Y={LGk Vmk7^|=2duQ/jl5cH 9cR;AkOѳe !:^˹:nꟸc|O2$ϓp/)%w.`R ϗЯIi3;stdj[b%No{!sڰ{Q|]l[ihs xS&br6yL:UCICmiݑ 7=%-ssi̿ABD8Df[ޝGh6~&_K<$GvQho/T:p4\@NvE*G,46̉w>H-㇧GOPV~:gSRʎUӛcQ_r [GR"dbls̈.1FDmL{V{5%haf@dz`'mxA}hr (.Va7.[R$' !l=٢o0:=`c!\eU֓ , a/;缫ͳBe 8H auv z#)bCQ .( Br$$dI%$T"QZ全U-,Nj2 9,BcM:$H%D&SHTFp 7bYN21EqHaxm߼_i "q`q&  qIB}Ҏs˴]&myZ:^b+ `a$WJ\m[#!o_&-#]>C!vգNE+ +^"/T[4dn٨U=n;?l6YK֒j"I0 p@FXaQ\\A7ꕻDL jY}Q/іi*: 3^q;8ʒPz*Wy]-u-0H]k4ȍ]Pr>/^Jk䝺mviÜ +el[;(?uHvCA*P$L$!:02QN㴪B $ H H9^-hA> Eyu8NW)p߷c9Ncj*(B*mc.XGiqM8M , QLRc!",A PAuԸD$EJ0J0եR6J9,PM7uNT)`sx!W̬^X(X:$fR BWiac& e4-H&X`iXo(GDa6*DbKD\,vƼ bp _ `9.8B›ePFG7&ZwP0 :n1X p ${iq_w#,eXj"5Uqԍ9,H"1Dš!bT.agccWZ&gbrڦͧ|]5( k_\΁ҟ4.fW@Bz;)F9(ҷwKIʂ+gm䉕\<+F,G}2D %$/%Fzg浵,K%M,G#__.K[qF*CT7$j{3BZխ, CLuM_[`QVg ֊v&4NxfYZݎ>[D.M'1878¸֣c^~5cp{?ߥz&FCR~&G_7tNK L`}ie!Tζdܧ? 5ֿ<) OaZRdf~X 7:6EKũFgz٭.9SUf#8NK7ڞjw4wvkB^lSJu[.Q9Y[I.! `Z-qq@ D\.*ntބGX/g:[`g/e_'K2Ql[⎬-ezLw/o(M~%BWr{q'棙G.-Yua˻=gQlMtNGk@:48VH*SY5:)gsrJ1F(wc{@0ʹ?'ә?KSR\퍽*e]p5w3.esyjJvR Aht)bqncl!!JbZiZr| qBd۹U|)a ;cut3ۓfW] %Yh:RMVޑ\M`*Ȣ^ևho:AqE^{έaAKl0Vꈚ0^f?\^M$/~|}ps;GzFL`<n<_:$jC##k`j @#CÎVSl91e)8ґRqk4bURD.S7 < QX HLA !px ̇Rp-EiN7B2BIuÌh|~V,B[a%>5r+O.n B;8&E=ĸ,! F~bۇL&~jȩE9Q@Q _E&ÖP cƢ *fN橷 E:X1-\ӱI+g(cE{q1<鷮i! 
Մo7OKL&^m~`~j1@xAzߋ7t7-z*LKch?$&ˋ] 4T"B{_}~M A4]ciD5.@e8iiZ rqFxK)-M*VF{WAL<0;-8SpBN2PxT%3aNYZ!3ӃCF sBtGsW6>7(}T##9Y*Ca|eZHZ:>3i!^;Ga7A)3v>3`z%zqJ]ԩ(Ed*E ej\_7\ }EF帾'HsiqqjcmVSmO^wu8mk~ǁՐ#^yWIsp8諆h]zQzU@gX/m\u.N%xdQ/oBWTѶ.D0˜4.y{Vsކ:i/{7*U{Hբ0zR'0QS_l>.[g0$E?z̕޷+-5I%WY洳^ (:0uC<"2!DTkSWnG}w10ŭ3 ct-`pۤ:)r؜~$D4V0̺O<ϗPSTN.Ҕ6qXr 'hcD.F)B!9ͅc `CN~u3}=ՁRyFonٳ1ryj#0a/a1f"&h@L% '"9 Z)%@`2j#Tͽbڠ`-*X=a@S}-8ֲVferan=dE$P!Ibn)2@0 `-0P0ݤ'1` ?F1aQz. q7qX8<5یȕ^RLN=R}| fu_4N8-go_|Lme|LL8fjpq׎ 1_惘{C9lyi4!\rooU,W@=w p KԢ֊k$܂ wͤ|˛I+樓bT19驂K@+D-a:9@ qƳLS}9W/fx=Ͱ~,(ـJެ_#R p(憤~y1'X*bܣXCkkp&ո,*02XGK(HP* 탢 !@ U$R〉sA'繅Gv  )Z{su=zXJqzZLۭg!K:1]ߪםkl#( U:̓LeItNT"+ ,3V7# 1pavMj]nGX3}=ǣ<٪TbƸ2z?hR{?ߥ /J~@Z"jItM >_y})>CR~&GN'Sc"YfZH'z^$oҿ 8a\ӿ"Y <rpq)ST9iB^ֲ)O~N^D[] rL;n\gn;;5a!/D)X-ism {:±:SGQ*07Lw_Zj;` iBl%}fŔbʴCI3gY{+(!5G&P[v{h{oCqN s‚g!wVEƫ㬳s:X)SĿakVz]U!S*j=Jzz&f`)#'Ξ 1%XdcSsN%]ؗ/3ƌ2m lmOR"4Fܾx `}=Q"0U5vMr%1UڝḩQ0yƃPʔ*z' @U̇MDg${g)œReVGJlʞ/Th%O]lf'yZ5-FoפT](Ϛ򿂕v=@@Lxkn R(&"҈3R~ÕV* x*p>-F:,*?ʉSc-: X *)>pF`R(Ps՜8Sm4D`!dx` Rb p!x,[ mI2&C*I*OTSh(f]}' 8=_Zc81f==}Xog=<0!  E˘8b֝I1<e$XxU cE|Ӗ- N $CUZkՑrgQff^[R@9~IsԞ]kDD!Sr@fQׁXlc Zl<=hV.Fl8"G-гȧ*)aK*WJO|7p*칰@ʩsI'%xjK-'/?7,opo7?')40U *vZǕۏ#ǎR<3iϜ=ɓˆ zE0qYfa4?ۯlv:Eo`uCcYtOV:+􋯊y/%:@/.oM(F|sM @_t$րZ=gp{;zL ,4ۄ9:1o;}.^n@'95w\nB/W)?y(n Mfӻ*9znyF wgWa}u{w>,Sy>Q>;>F*^Z-: |? ic*}a͔rK]JIB`Jz*B^JFj!dQj;CVNi W2c16끼82^2|wa8ܚ T8$yweGp<._]c,hk{Zjiai7rU` +ݛ74,UgOx\hųztvB-Pzb8vsؙ/Fz`6ߠ,B֛C"$fك9J‰Naw^e's0moWM@IպOڊPRW!:SilN6U-30N[UsֺH Mr8c#m,:/b@6o&F)ScHhˍ!\?z\+Etz.#7@ H}$)IK_Igrmd ^;OJSH iD!"HP6|@JUeT1ށpduY˘L0?0 J(}pu ^EE2:nojvܷDivo,(?WzX) N$AMVJ/3&z=o>\隀v9H"$Dsl*lX9yQ#@ Me6Z2\ko1t.yǟxR wʳ &ám"@ y]TnEzyi_pUH5+ښSZ':kdOR6kzGΉ0Q"o{EGhKOt4b;ܺ2` )A"A$:zcDQ *j>.BU`ɘX9P1N!,S:i8*ucAց!%kPZkT\{X751"o6Z|GWb)EYNޕiM)D] ~l6k㡜`2!e*??%;iYPw&_%V2ZeZ)-e-? 
_5/D=[EAo{.._~:_칟͕^ )&4F."X`̩3 ˙1_ T{Hm~#O@L@*= pv\C;;A'uҼa|nS R20fRZ5>[{Tu#o~ 7MGd')Q p6abl4Q5S]YH|>'s8ڏ-jw-S148BJhו@*ITǻZd"_-|g+ЁҫE0Z Gengwh淲upvY(TOz0JiݹûToL/6wC~𽾙/7.2;[=^k]l=wۼ"طdz]/Wpk)ZK4xpۉY=ؤl?׮r34" G>YE5/{LG tzD]awFҠn#Qp.x!fg"W[i9?G$-3=$m%h&_ӻQu⶟l]&UBcbnϷ߿o6o?P]T}c￿zsKoI_{v#8nGX:L#Ou//{koH:)#u$<{LsP=*ec>}ug5NBX w-(B +fY[RKa$e a^6Zs4V @hCP(#56T#whVo+)[s J?iP ZФ$vEA*sخ^i2!Pm=b*J; v.L:$1Y-Fh(Ze &HMnJ#MJQ2}Ti)/Ba< s*%eAM^_" "Lluӫgɲ|<[m JxiƂ ɹ_)-;iRh0zՀWevw׼ȣUM@d4F\c)wgtt%RU/h˧"\Kі0FQVjY*>Jca+:rW[=*ǹ!+ɀu4R!ܰ.]f:tP حks0 ŖF'׉~ڛ%+ Z0w 77ٜW<4VyPItDCtRQʫ>\v~e7c1bO{e]Z(W[')/["10N"ulUV9+ƛɏVJ+*VA%uش[%oݘSsT,g>`Tayp[-ur{w(b|gFRa\POz"l`$u2ߟ(CH2gsy0>k.6;^l;1I4pyM[R4u^ݘoggt8Img' S‚ؔψ}jMb\ڦXa4pQ"En¡} !RB-iZYX@SCemt!dEzd#T ^JjYGO"յ0V;Q;` z{"ыubwjȼ9$f[[MLmmڱm@$'Fqwv :C-DFi[\*rxH(,tw38jUBI&Yg]j0O2Lx# ^2j~RjN)FQ АYedY:FkP& i F$]{I:u=-ƦjeDŽS`JSPׁ4 J@BBc1<8cQD Q@mY%~kk:PFJex5z+]hDQh!\Z'5H$']4 t k]V Ŧix>{Uץ&*5&6Zg7ACj%UָQ،]/WQS;:8یSPX K)@.P8KIK)h׺n}I)B{Zg)=a)<97F Q_FRpғRIeR*iN8aF!ʤtZy֥'.PJf7U)$#|{ը)p4jf RG 17yOuUrre[R7 *jB2uk LJ1GO`ݻXJʤ*-iK)2)%5VB+)&|9bsg?fk})~tS`:*5+D%`cu-* qu9/qv?/.||,+WZl  L!L^(lFw\E͓}˛Ȣ坟[8)ݍr5>ӖReR0[:B}qN\JM r["'1|>!C<𜤵"ڪ)Kmx>MC솙r ']s~> ̘ڿ]f/ז~?Nfa}9=I*~oS_EW?F"zpQ Šh@9yp~s 8r:hP:`@Qpۯe[R0!?ܒGNYY R`KjTI*D+~ лMsLs\[>|k$ 5dz4w?N:SWJ%bISWLa-x?iQZeO"4(EaɡuDnM.df+KS#Os^?Mtm37/($g\X$l6"ૣ-f~CI˒I^,6k3ߐEHî =`<8:{jA1Znov6}hV3189$5;zɧolB{j ;6 ŷ>T[Cbc;]fzrsƁ Zo9sB'qiʊj3OSZRd Q,x_ĉddO3J4ajxb5{C 1^'M{X2(>SФR,ʰi5R7ak3O<҂ܷ~ԈZ>0ؓvd n|RjLvP/qԈ&a*.!N)r *Wj*(%b$MfjIT'eXK}DfY.&3dWJdIdgDWb@kdƒ-QvZQx6<;ǒr(6B[W:n qBP5MhG%Y}lхV{ͽ';Y|5H19MrN('![i!Re&  d7E2sw53cpHƙwl$%2 B%Ox.3==K=&K ]Kq 4YY,4w S LxL X#c Fw޾6 6c"WC$J'4`D##BZDl/ o 2ϾsDGmDIb '+SːL4&l%}xRS- f@}XpM> Qf(Hy)  h$-P cHCpDНjt=q<g'!dAvu3N̊dI]sFs5(c1 f3P Y.Ʉd @DnL'EŸNr3#!ȳ82P`+10J8aP nyrHԪ6D xێDG4dȭ*r†(<##dޝj)IGGjWldV nz!)|4ųkJy",A.MՂQQ &2 YU (BN`!B݂B_tQ%e EXO}]Jl)ΩV˵dXC ^Y>!t$s!&t-a" b%'P5?> !N~Z!3hC&6<aj(ϹdYLn r±@pՀͨDE<̲y$,!@)^/nDnueԹ,;(nrĩ1zu 4OFuI\ڭ>NϟSf\}{2UY.>ˏՍ)j}h>{OVfy;ſۛw4W~Z?\vcxCww:ݤcvh5۟&oFZCN> AjO 
XIV)߇>st>Dѥ[>%ހqt~onKF'14hz =|w$n~\|?59$s ϜG`9T0d5~wrölPKܭys)]s JbZhLujOm,N7\.OEq6-ݾܵHX&H6f:?nFjttU69pg? X򛒙nKnң2j!ڐ}e~qUV<ߖ~WGJTh>N !fnHK#Y(H"[.YL ~-t&UDzu2L$(ro %s"4^K0ʳх/@]ɫH.M SNr6RC,JzNCv2r7v):[Ti6F(}C>3ՇzClihUv=,?dcˍ%p((^ -&ׂpfgȡ͵?k04L\  ֳ --&T;+kMX^<%k \O[ɸѝoMUH#nNmId=iR7/"k(2@"{q ͽf4_㣉UJV{IO-ߐH {z8Q! ,{>{<[_]2K.ij>ie޲6Q&51- kAhJ2sY,d=[J Lȣx6lLAG#/2 Xބ1$&1|. )XTR$U`pւޥ TΡO+z2.Gl*=W+7 i%>U-5ҮGwNreZeV-bߖJrW?OKKÄgb'<kY:;# 3dӣ7$)%)dtYz`>3L{(nmNFk1t5\ea#ʥ|8 ѿ"|6)ZZiن†K$8ӲČ4mӪ&{CVJa.!&Uht\ "qn2 85k˸e{5)ΙQN;-Ôp>$[dpʊ`.2ٔ11ļg4a1 _Ԑub|Z(Fu||هB-j%6 FrPaO̶!BF ކ42ϕJWHai+ejgϰK ,^ezaȉϱZaEq'<׃ ;ϡ90;dz+LzO{٫fi+*6YxSP2?*Mfص}cfhTHg,p9%E[$p!DUJUE.9Gݩ4eZzd|,'NR+Kh!(,H3GGE'Y8!Py#y԰I$Vbd&EWpMDRegP3aݩ&-d]2j(Д[EZE i#uXVcUK~x]914hSjeM YD@R"WQ̬$i H?Y-7Ղ5՞M@iCf)~v"Jkqh?wt_Z0m##p2Ue n.NK !}I^!J)UEq1X ߽ٯvAd $>^|N5юx )b'I?y@J3Zɶa_lW; kv׋$ZIB[+52Kp$ q~H4 xS /@Qg(9֮m63hb4٧LWei0!,e=ztTpK*Q$rmY wAuӴW^w1n_ެ 3φs$):[&\XWj+bBM |dF=;Yν3&XB1q  &ҟ=`VbepA,ˢ|0asJ#H~du3y*a]_-',M-,+6qFӈf`%k&Wz*/FlF#5 ?r ܟ bv,]9':~ ݛ6u>P-莦2?oj])Z[{'}z-Gn7$8s;bq^wi4xWǙۯ~͊5i%-Qu,F][[;Z5 Mۓ1JfQLX,۩㎮fci.{omhC!>xutv h}J*g>-L-Q$+骭X;); $3܆ŒsX;-rꆅd)$ }̫?td7:k2nL:XWPNr%A3)&`(@&EL\+g ljQ-XytGƨ`1Zaב|DR)AZI2E417TˆTe| _܉OW%|BcrB9"`<9MY%ܮjє(z]VףLP$Z߃?{׍s$x 0Oy``xKR$[M"ϥ%w8VtU}d*~tGϧ>|B/2ښyMw.ܼ[⧇_(r/=~-}ɝ#kKTW"<`DzdAb&l4S˞wIJ#fPq#/T( :uX.D!0$8J7scH$M`P2܂>ԽnXs,I9P̗:T/:v1v,suS4E=OWޔK  u!iõ $HɖfBt 髟8N g<^c0F6L/6|9qkWKfJrVgⓧV6GUG-JiLY73_b1S[ܭ./Q&X(LMr@l<𷯩#U\*cCdp/W}\"gg?߹哵ErW97ҨW ]N/WkRG['hvy] ^-hZ6bvc|I[U|ڿrKV3V¿|㋟{w뗫w_|uOʼ>>]ȼiy p[*"#dl>r7KڥoyֻF\v c-cv!RijdxksIcSqBAx"pݞxHp!P$;LŃ!#*}?(ֳz[JK^Nר@GNZHƍe?*Ȝ rtpL^ ZۺiϳjA=KWt+5 @m-?ͽ,J?Xe+QzJ8j$ >*/U9p\XP&G?Zk˥ăR6dUd\#[66KCQNSL()( xr2Z!S #CLaqZD=ܻعřD[,4nU-G)7rtܘ }&]!)V :7KwOk8aBm`!/8vR!4/mށ`L!|7ׅtyɻ̵ZkZ f^{Yoc,,}򳽍 LI>ĕO"ފƈȑ4z; fryĥUL |jb!4vLjR &;ypD@Tyʊ7졬hR^d+YBNBvIaO;L: EW (o5L<&N^5*Qό|畯yefjI t:]2>yiy5oأQ*)ξh+Z-K8Hh|%Ru^ Af{r4G '3l yĐ3h9c@@2JA$gIGSM7ؼ\s@}LoJ͎l zٝnXkqA-4N)h+S ̔M *KD27`ѭ*&9/ !)F%))fKa =s>܌3{dI4IkV&R٫p &9 Ji$hCށv@m r vGbr}#&a3S>uzaSgo6a39`1\V4`} ;;mec\{RNlUmp6-ۻl5@Ҷ m~MZLi+ys 
eEubꉛ'C=B]yz{5fgv)9tm?pvBF&^=}[ܳV=\KcB&DK7Ay(6_O}8ئ$nÉ3)4!ᰜ ]3=9Af({puV Z /}hJJj1Ҿ-kE/{3'h k'HAXn(^ y)e5>%IJl /-JULǶ:tF;ɟ`dJk*o0[sb[A&@&';Z/+>}뷤BOntgӝvPޒ)p;VkIs'lh|MU8W)~H񥦬䔭vGC/śQdXnwa=j,7!Kx4s^%,W`֠UF!"~lcߋ#`ŨS˛mWٺˋ2uL;NX4 c5Xa;4.?)7όw#ANh7kFWc'ZW?yY=|n>cIt ꌄB8Ϗf6Yv&5߲39[ ,̀-I%rgUmF:^vZ_՟T!;8(Rj+xm:sۄ0~rpZM7;O zlnI N7k>՘(j2y1OMf%x51Rloܦ^ V{r[7ixXU^.k^n]ncӼy%ס'}UL^fn1hW抣 EyrP'Jwhit]n]2/377܋!n"kK#;'C])!f응|ξK~ˑoSA{( +΅z/@V٥·|z/g+d׬<a˹LxGD]^6 yyXLDB`]16bR$2΀V椙[3Jˍ:+S))>Fq>(j\B͠9jCKm\ ouY%%.+I3\QTvu]@yٰE <|ԾWl>799"dPblN!@ѹ\(XcI 7ª%OqtV켧 6l&ظTG0P CqCk6Bk h0H&.B,ɝ߇ .q #L B$j`GҼ;_*-ְ(W6|yH(Є 1rr 'F0f22 M*/D1sKSE@zG4PijqT'Y0h0`ӘhIR~jU;Ź>-I\d҂u^+ڲ6Frlɼ^*imVI\Iakn6[y.hTr-W\5ʑ%{Si )q8$%`HqL4OKD#t>| YYdKzMVgOVT#CXi-LC!ej'SRxa\JFdIȲs^Ġ9;{ ï7W!KLIzZ\ &USV~f X*Ѹ[ls ?ϪOayuYv vZ2\˧Ui?V7?+7[$½wYk` ]}eggY򡮣Z[ܼ$7~S݉ en=c)3;a`"ʶ 500QjfqVR.(S:*WީE y.k'*zH-X[$Uiy.4+t tt.*4(.f䜪nRWuR!Ҹ,DVUVLr'J'u) !LuC28du #cY=7M'u -Ovo-u/8 {>RgΚtÇޑz-#|f:±s4P8Ч aVNf3~$H;}ۉ(E HAMqsx}&$F?ChAk=6BW-lذ+mv+l?ˈߦSHЖhj[hӓkԤhT _k3>޲Zl pFќij*"5nШhID"@%d+ Mi`pwGjh4?X;a$4O}D 嶱K祿Rk!eQʎKi[RGf ;ւ9|? Geʤ+Kڛ_ _~w~+|5Ym !i:]Bzת@=9&4@ Ue&ea.P[Vr.ώ| ϼooea:eu>>Rѻ Ą4LNTs*0DΧWk͹(IuZIp_Шa2 5 b}a/Vsklֳd JL6 A$Dɏ9k}li=>2<9-i!y[Hsii̼WUg-,fs ݽ#k $- MV{ #eNq>z`?g9ڼ6t tA\? n6vQVLb2EBu=Q4sT'&WC~Oѓs+Qͽw&kL]@+s-+1 G%+&1g~'|CwH aGgpJwK-*ZtWw~ZX>w%⪍j%%^Y[TGY7ѻ#VK6Ōn1(m:m_$ 纋_x\*[e~"C|WsUSV[1mqkqq+v7Xw)f&hoPwh=J  кΔ.!˫ʌaJ WTn@6NCc>|?֕4 S(F]|]Z}N粺? 
Ro|}ʕ`N*ՂMTxYjX .-MB~jP]zNJ%ȑ//[܈T@!es)$ @ ` Ɏؽ1,W9Krt X(ʽ6qo )%Jʊ˽[Ȳ*PsV5Bq%B 5ڳȐѢsm] 1ٰUKFh/C|NsMZ=\!I ҇ŽT)͘/q0sK"ZM[A0Jq:p[E|{P%”6gPb mZ]z;h̭@5l$@q#&l ;7 pCKU=Y`iq8+8I#nxnι–3jW֡h:WwMŵ~zr-c*4 í:Zl 0H'qԹJ!IG WAlo@&qXAxM-PS'皜?HO3u׷ᗐQ{fkw1_ޑ/?f_|EȭIF???XKb nq&/EK瞳܃&Jt %=:mzbPQ}?t=Nf)h_V7gOu[xs{~2D8brxUݴß*M;OC/dS5w9E*QpnOiwWK*p^6$X;h-OܰZFlԽ;4u-O/`uwcRs9y$ Z߷m5kL`탈&Ɖ 6 P*@T\-ڻ N7XD-\S{^ BTQFolI^lljЩRohsKc&_ϻ_]B^qwֳz{?UC낤fW>E{ͩM6 к fFu;E:UMfnC̐|SDsuiик fFu;aC:mڌmmnC̐|vS)nwZw˘7nI<z wΩ\x^hr*P^ˡ(]eE m?TL@O[8yg-P1bIp[HpOyV;cw2ݼwbl6h8#<#̜!Jt8юfy/a^z9y`QqdqE-VWBd}Zm6=::=Jȉ$hK>YL\EzF2OPs5䞮U#) [cWsܖM4n>I26IQK2n~TmMutɕxX5Pt?:"0D4g—mm9ҶZ[r?ӐsBs% bQ1fҝIQf 3ED=}@cCDB[Xolq 6$Ӌe]7u읫`BuRagS5XdX.b,bՁeb`v s&ءd=Xa/{uNTHNHy\haЎ t\Șj?ΡJgm.T Z**ΝGcRo_H&P×#Nz2ڛF/_Bi2 ,c. Q1Q(- +hQK"tA #K͝73[j\s6TBUUtPxP#AA{EB<2dR4 '^~|5AF"XZpeF\xX<,E2P?Ȟ 腬qp9Z?YQK bj` y41{lk„a4oU \.8#)( ΔUa}Eq$6VE%L6̎w"rFD>GA3djuH%BAd,,p JԑQx1<WכRPl\=ēichA%! (&^&Ol{Ad f;ǰ&eloW~A6,d[Tuۅq}N19'v=[}-3R,igOǧ 9iWVxk4?t _)o=v}{V/Ll᫧eEʊ jN3Raz8k٦캄̖3y U3RUh|FؿjMCB5KVҒ ԩʃ,لg 8'p2HJ2.mN:=TlI#~~F2?/hpٮPc4@]=B=E2b(m 2bۡ4 ~ }蛆!6](=΃N#Lއ#z_ցlߝ݋_,SkҮ)Pg\_Flitem l5M0ݍv|ɵxXa4joRv:-6E$k``tAbpl EwE=8A кsi22{Wƍ K_n xilV\}ʖ fBZRR⻺Ԑ8ƕRlq8xFOEV4ד̦Fe)L" *hUJ 'f b@c~(IpZ/b5.ۛqQleOfN@ ̣] ]1DPӜDOs[ijI!EwrS#vXRh7E5w_c9 Cnsz|:foR7MWyͧ+sn}&lBein\;y? Iaba ՗q\El[߃ۙv*OXũU٬=郹MĿNrۮ!D~V';J"cF3PcBQ6\qfcZjdmwu&'[L.Om\35{Չ3 W!ٗ{؉=xm)0!XNJSd< p 78n=̘J5'G:#'38+h8 s|UѶ(ôx=F>kY-zcoq1-cgO_D>ԑu 1jbcӈR2^)ݘ=DÕe׿c~6{9QytkXK eܵ^ofbww׻r 4 P[G=Ӆ:&7$hV;Wm6:I9/{> )BTˬӢKFOvT"uiK|Eyqo!Lbl2ZWؼmFXHncߩlњf Qg cKOgVxH%NMHơauH! j =!7ȉ ^W60 \0S? cZ 2rk8m;s͈E^Oٻ}{EwxE$RQh*B0Y *xEJ H.JsE$ax{' E%[ԫC,yU(x1#9RJhSWPyJ PVH*)" Rf耨D5tzK{!! = SԽgi&bE Sn=FgϡTn=iWJGqϓݔ"fŒyh,1H9iZnw{j|!0ԥ[ЦL78Z[cdFtt<nqk] E?(EɱkN $Rtt~X_Ivs6INV?RǨ](H]|x[. 
Ǒc.qd^<% Q)G IjE+wouz^(=L tgT=xmhBY hO67t^r.p-9mpEĦ[3TQ QtYC&GHX$ ^M?r&ѹ*y (Mb>$JB'bt=g}pGR\9tŠ+Qzf: l:K|ef+X3 T&(ƙ֙Yƙ s pC2{AE^#(0:G@unCd ]%t@XY)Z*!*ǼuG)0^ QWxaGC(ZxQ| /}D+63zάZTkȍ Rߣo/qhjq)h*T!5 U͙D+ EYT҅=Cil~!@ yAI5Rz.f[o̮5\{n8'qpvi14* }5K/%+G1m6-+5؃ʱz8)r$-A'rm4O K3cV8lރ?VNooͿb`\s|Ҳ_K;>ӚKj-ϿOQ:ŋz;+aOȾkWd|TYru[:=91+ (H],_z ",Og{k 䱔/.r%Os(RJQPD%YN0QPX<7k2$A5-2e*?'.Z:.  2xa1DvA0[a$_As`x刨0Eq4:ox+EN>r6N&}e]c)!91Uv '~e}c巓#Xra10J9 ̾G~4{d^4<%j˾fx+0{Zzo7̯RhHa%%ӵ:vhb7O(yu9U4] ݑ N5|^{ K0Ԕaj?ry&䰛yH7;hTP։,I IPFjO3Q)$T'ѓ3<6O<ɧw"ޔwoO6D'7&рM.(XFSyy\Fw%8@XX<ÉN@UDpwG9'4^gjd_zwt 5Bp)Zl !3Zap"=Z+ڢ%wn]՝g?2-Ǵ6sboT$؝jY.JAok;[oS"ݮeN 25eDqگ>72գΉf73*Kټ F٥1M2v<#MNyL󮝎#,FQt^سA'>^/z4/LaHk!ZF=6FC)G>AN%COATޑC#:C8㜴 җb fQ@E2C_G Qd Cn*$SԗrI+_CmR˾RmJ,D H Dx' $ri;D1<4k&wxNA +[smJ*'RY8')Y߇웸p!gaA3lƧ7 >g3Q)dXİJ  bHKi*x.:Ķdz@޲͟:.zq⧦DRW#bmQq~WܧFxwx$kr[_G?Oo7Dq?jTM G۾cP>}{tb6Q JL}sKQńxq?>1,QXb_ss/;$fִOLt”l˿|Gxn0(ԋwa&TMzFs3{H]eb8feV7D&ޕDlAK/OK鏈hnb%NR_oK ZzZbwg#ˑNwv <.m ZU:1)kbY^!B%^c뀈ގ9N--2낿:}1 -JER&HCrc\`ԟMKg7I 9 VgҼR67:{rdzJ)bE] $$=E'<)bFQ$q7\ӪK}y,Q"ؓKmޞˊ]\K״XJow Yvwe282/lC~1-[eU R`/~w~R'tkV4"{UE NSbPf/'lYx K˞}? hz@qca{=S.G`I֭{ m::bcrA! 
z0Mi/D\JԒ} {2l^{(C ~՞/k=p ߒ3f^ 8&'|-l>_GIBRi{;\&6Rg([H9,kŠmP2|(yQ}5#RO0ND8Sl+RDb*, ->,55.:BQHVsVT -zIu%}CpzņؤQf+JY%Y1իlb7hԿsa~ 7߭{XuۙY==Y@XތOQjenABxV3[YʠgbkX.iG™jz8kO72ێkԾfpCu?-3mu][o,7r+^,0#ުp[ FiQ#)N{$ž Gj=VwbX,~eP!Լ&kYǭf2Y' 11eJ9]UL-rzeCol7xSENzW&Lf%͔43Z* GIo% 6kE Bnɑ81;|`$FDr=?g,|@fDGۺ  ?0^ZӨN]?X?-{?r`H0Ubhًf#?咚fCqdsFz Q.2ɇk<۴}5J_J|`qV(Xh3*%uGl;KuNm'Ol*L5dxJeds 510]ahqVKu9Y)[dQ+~x.q<+-պTsJyxVKuɗR\l]tjXɋ-۪@4bVhh)!)t QX:9#xN)DF#v(AK4*6l~WhP(IRCjR *v{h5ћ]3&=l^R]Of1zHo\@Jt(fѠ%(4K CGCO~8/VMc$.D_/#v ۰֍R}A|Ij MԤ}UahLJh4vңձiAʮV(׏ƵVl, vrSݵ'YfF #_.?܋!ˁ ڜS[>eLPgrx1mz t QXoHTZ=>3ĎGzךQoƧ9xQUmÙnē8 v'=D8zl4DV)z>8Orh3"'Iu E\v;iCazåiw(fv5#)"s; a*OFFBbX܏^栭3ce`q QVґa& -%cVw[0|` ڠ5I] GyIYtrkgȼnfp^w&l0i B[A)2JR,v$}f3 Z ]2J`q}^rywBq5`#$+#Ђfjn|+Ð:$u mEI j]j?|5 4Ѡ4qV*MMڪ]R_K qғR/IX)DR WR 3Y [)\Sx.X)5*ZgZZK1m⬔W#4$CVRsbӶRR3`G7f2`Un\>|v"0X=an5< LkH+s̸S ߏ SW呢刔G8 LRS)UkRpUufH}YVט]llQocs;YoKԡ N~"M^?=:i´N} gnyF ^lp ԝ:` m[ΆR7}FjX P٣/SOGd18Z#ˑd j3öi.%hG}0D,kK-H-9C~ԃ~o8V,At]A4NVUp{y\Dm?ޮ^Y-պEA˂̺D~1Nhwn^pCր8h&%QVvTIpԷ CKy9׃ȋw7N`o)F!<¾&@N G_EQ',f~s35SəFYeLl7^[q,#.{{j?nJ>*rW>XxdkK|{'B\*YUJIcN1?:{ pl0)Fnp`Хg$ɲ\OqJ4I웢qr#§$?2lwi=*4"9iРf`i;nʀɟSxhιHN8Wa^ºaDT(CH&KȖh9']atjN+[@F GFRں8\e2Ŕ0+T  ΕPDx!ReiP]T*wH+ɯ7+XVcy,Wp[[r/(Pq5gHI2\4Ii\p$\*$H ~5J^"t)|Ywd ^J qv8q"'j`%a2> ff >m+UqʗR,ZگJ)[7[Lx.XwK Xn֥F8+5iX)qVZKLVzV*T :$tWR+Ӹ0Y)Z8+XǴ]R_K MT8+Uu? JcW&J+Adl:ӊ{}V#;,V ~@mqVtݳDzJJ:,VMvJOJ7ìH7</b!2-spyƔȅԌ2NZ6Ǧryhd`ƭ5N IK+!͸U^t%r2[.SU(msʼg*~pe.K%i +1Da5B.:" ?&LԦçJDΪ. 
TNO GKVpf K~mP_KF3 )tLidEqS7E1gy?|/+k&}% pMbKu3<7ׂi  fE,e:%9.(+ YyX FhvVZSwE*yH@RciJ UVj,HGw X)ŌIPUR t]7Z0B-tb{O[}Yvc4VA -z 롊DSzkF9Z\}5J_J|/ބw:Wf_sg9a|z򨇎O_λqZJ.%C$_N7ᄎvqQŧ@|.zPH͜q.UUzD$U+>>P{FUw ^#jϲk3ߗA_s]<-C ֫jj\j_ĪE6C9HԳXyahs.ˁ6mLp#.k>E;}yy%6$y*j[Ѿ=l;L-F=ۧw5}>Lha[(ncZQоM۶Aѽ(KpJ4µ}l9ԈG3h3ym'-Up}g)P0:u{Z V:EjHqjPh};.13FX?ֱrׯW?߆G @ڊΕ8K- HÅ KI˷-z#b4ύOue,d}&`/Cc/gvh|dmE^w} U?\~< 2wOiq,䓛h'L>|8oH}DU1mw Iy"[M`]u.&`N?ܺ,X!c|,,z__|C/B>!*xsso*^U( [?d?Ef:˼!-l7]_J\z~tgϋŗ3_ˋ .Kuf&=[؇'*sv/+wv_/Р{S/garSmpF__po^ {z]^#s>4 ,8d1znnlulyRkNHՂd>T99CFץ5g]t0\qn:SԼuj^-z)jЌV}\$>t]?~wkݎYźeW9^=x5|Gttn閈R&t+~fthw 3|EpB%t_<}`|!%^ꭨ=1:\o'-p+(-[+꬜4r#jIvŔBcP:>0@f%ceӇ1 >X֞Ql 3Y*GD9 ˤSV۬pZ2^82,M}'EJ,y;>, iOu6Zn# ҆sޑ{'"_hނ.g5T-KOVZzfY邂o'˻߿oouMq/ts}y!D7WzYp=S\ap%4 P~||כÝ2tuC,?6$o]@1@e1/HPyv:sѤ}HÑ >86;.#m>THQ]Ďfن,6*.8Ĕ0*V Q0!$ZCIf mi8/4AR;[2BD) %i, B;,sQ42T Q\aj%G0އ#fy2+W7[j~]_CCy/ݔ҇o BRFd7?m?>֎ X6%h.ܿ+ }R DףPcK6S`qTʼn!^W`Lq;RA бrGo;X$**P9: |~"*8xq[Awll%|31$nmٽj@x(#чl+cVkx8@Z ialIKqބ•H{ZJo]}wcW<׬V; OߗUXmq38H#g$ ^z18=}b7,w;Aplz/|03Qϩ^:a6mի T:\vִQH> 3[e!a]ݖޖa"| gKUjK? iHŶ^a`oo:96kKN5TL 4fs: \\'%kPs (Tnasor|+g靏痻}hc8x~0L;٣qJ^ S- +Csɘ9BV34ф[ڸ;hCrLXc";){'>CS˛s#ޗA^eu.gƔV{"q 0׵|HgK9M}ObPuQC!/qׁtG͔#FJ`D`*" 'xy|w)QhO|ww7bd1fKeAhIP)!1[Ji[R]Pi"JozH_)ZI*JQ)#M #EY )j(-t#SEۦz* s]l0RD#! @"" ?H!Kl\p]S]`ZJ1Dvk>,E+`c쁷`cعH N˘T#XGRDHJ1… FP,r^!K8Qm (2$]pm o% )$Ѽ9u}kg71uc=)RՃG%;5{'Z(2L+F[4b&@ $$z/kp^J#%`bL} O]!PvEm: .D.L(ɡ#YWxpI[4,D!rdu`wX|\9,$+ $$" {@0#8d>M-ng.}#/Z /"xotP0A)Oމ/vP]Շϛ:R 7jKg6<(-jkXr~^ob|yˈO4ۼKwOf7[]hV暟n<=*2*"bb󯦾Z?78 ?# ?lUͲ5no6s]ص2;/~=5NA /!;,˜ u8l;H*'VOP2®O!TKV~ Bz٧Z8[ G(;Wv?>zwa0Ȏ8׌UOU{]R}p~v+~v't]];4?! ^ӻ^ԍ箍\ܮwnc+^ͯ S|hp@ MÀކ w{C71S.w{ņYP={vC?٦&髻P9;HB^ȔWMUjr16RxgL凞Ej.$䕋hLUYOn"5햋A䶑\EZ)طvrn]H+њLEF[P@j?fWzx+^..±QbPԥw}f,l~vlۮ E@X! qr}~7uvXgO ;k.o ˌ][W37' QPA/g! 
F?C__/[G>%56%Yp<&) n,ZQYQHpUn_\Bm`偔'3@ nO2Ơ\A+FL F<3ayS)\ ߵq c%2' ^.BӂEQ|i =`QF?g"8oQ=_ 8| W huk 1/!LZ沗6a7w IF Y|9>G BN!ܫ@t^h#yOmHJ j8cİI/-&ܺ3 )"b8ъyӢHzc(r5>~5WQD E(#ɘ&2"Cm<U"7W8̢'BezZR ' cl "RH_M ?g?z9 I X\w?΋+OMrc/a?K؏Rݏپ\-/XA 5/ȕpvDЂ*U"!.eĜ+4%|ZS4鮣pw6vh❃ 6曫G.mv5,8S=Do݋1[tQY3AO1:)\ Pu<.1Ea\.@yƻLTyI4 8 j$R3UHUŖ9-$i()CE;eb1="!F5spO{ E%Q0h/!184F:Q", F *91JFΩ1:l BRMp1CD-I j8>K;;ѬP-_xтR%-WX%!cuPh @x71)5Y)<@K`2\"¼p(wEXK B缃dqfB0*Ȗ TF!Y 16fϟv1JxsmwuKT& EzwUgޗ`(Ri8p2FQ4TGm.5w}o9'rK/vhAjHPSo+ L> Q\;&!|k`}gsdJnm~IMx zrc%%('akU>RrJAKd.ʕIjW4Cu]y f1j AH=ѳ.yRB(.gqy8OyS0CJ,64'>!^ŝvޫ^1I7x~tQ*z2rrAQ?yZ# |?Td5{}]Wv{$Խcd?eB Vi);4lʧbǨ~5ARj;i_*Rm+D=_w7A@)}]a*Y'3]F%bH9U69WOHe0/5XvƬsՔxOz鯟%qIR|xLU{Pei2 m#xs VF!,Ѕr͒J^թZ}*#j\ bD'69h0ݺW.92UMf1Us_UcGnĈNn)mUoc8-bgvBB^d*5Vc%bX*2!)`!޲u8kW\r1\(h#^.F=X% ]LQF }nfx,4u8R#{HF\e3:fY8;σSOZŇ f! Bᢰd5>4Rm#,4-*GǽGA.QgEA0*BͿ(X) "@+u7iJ KCD<{>bo '|g_w&@>yl  06j* pȦ>1BAclF2>^J7V GF2;Mὖ@$.9u5[Ĝ涑N0}r|H^ [zƭV0ﻼqᬱ\~ﮝ6Mm.(MbaaɸyL_&^oap3/oGRmfSҳ8+s7t+׽ihbҘRŇLEƔ!=Zyd/~t)ؕNWѷ<[~|^v61eԥC>i06<9ncp[xr[G Str #:`MkӃ܍n $1*)S8ӳ։Ǹ#K%]P0MN6l86B*Lƈ6nQ=aH31m!޵5m,C9G$~Q):k'). kwIȡp!hC"8z@ ]<=>'wO#qt%?;݂LYxDN @]yDqݢq֯n'frlǎqqU6ێZ@5\:.eY;kjjVW֠=Z3ݴZ9i5^GƟvjAoD&# `"}pr.sP@2:&ZLAzNG1@j$@10b}ֺNM?룻v<@ޕ7'p_{d۴oz?aۑHyD,-f۱3q>fI hw_VOsse#WibbK yʀ!G&^Cݎ8=[*]yNid:Y'vB+ȣYlP9 iV[DqVd w$NVɳ*W$(wb,fZ&, EH !V)*/E'*/![*%RuU\pi"*AI;ݢ1 ҽ9rP!6]A #{̞ƒ~QZGR bD]?zƳkgY=KdYnk|΃u'$,r&jLЍT{Ko^ @ͷwLJR,xhS{AW[![lZ}rgfCYD@MKKDV6XU^ >E7/~ޛ 6|hKН'y<|[~N2ޒ*{Y%9|8At |]șCw-#RB 1ii' A646v[V? V3A|HS=YlG;n`=ȝ=!BP(v"sBXP ةV dMvMciΌVQMv6K`rRR)PUUY#)Y;qpvKҦqej#cĂT gB Z8T @ʴi9dDa{15ܶ-ʇDRY oBS=ݚSք|" 04T!+-l)޽z|rLD Utb٬N ']UկXgW}8(*{u8azpq=zk0jjЬ$C5HEN+/Fyl9DB~IP@SQlv1G~zF7 Ǚb~g'^[?s]E@>ɶRKIxA mzEfht8ѦFĬhxGcǻjm$t2{ݭzgZORi d2[ B=ғF2s՛.m]pKh\ѝrT"C'%ެ"-_7ѫZpElإm0 *d^ ^7 C{voY\4谫Ae)ML+OK1 C'p@$ЊO< 4ط*^DIVR#o"Yaai,(-vo)G2A:M%w ? 
uR!tB 0\w%dx]HnymbSob] P6ˡ?YsQaC0Cvr(qt.UW2!Wz V"+r_ҖVSyhC]>Qd|5FfVc0;1[eY3 S3ebAӯ 7mvioh}PZ8οI~f ,6f\[Y1o LS)kσS8Ot}m܉omtA> x)we+^d#{=(,e#?}@fc L #GI*~w Yp%磽ᾳ䡴'2E`;MzFJM!XI~o&\NJZJR VH>oQ-Q'-vLTҬRJTNX'g-I)a.RJTSZHII)VRn:qNsٱA{"5Ƚ#d4l> 4ZM}u601ZٞU@wLeY#QEgb9IJtSwQ8M{>Tɟ䈃AEz E00F8 ׽߱<{C6o>{25D4+z7A8K+ضÆ9P4Rӓ&VG}vN)V NV`EvC*xnTD"ECUpߠsN h hF~oF\۹n/2=*9g; ?Zj+Q"&˹+*(=u$՗#!8ʕ.RDugJh#{V =X,`,bPfTJLE~&BHGzAG%@N>--"}pBG0 ihe<~aZX~ ,&>7mt8d vUd [pjn^$jLy:#*]㟆_?콉I_;snY90*4n8؇+#-2ٕEZI7[533%]Hrϧ!_e'Mлk~ uzꍓK1"\MʵrIݵ1Gp8V'pō"\n}8L텩xrO{WFjxtyҋL&/^>sm-M)d&C:Qo{$E8\rLCyy[|χhq]H=@\[3lnͰ58loIӠ_LP˟_l0MU8tPV̟#`"y"aҨ/n^԰C$ݣݟ#:AOG&;A-;AX BEf3z!w)gb֛gwFa {~8/g@B*0 Wb8J>!-}(?w(tGG/#'wo!^C8LD2;#V37Y29in.j@9e9Ch \_Ieiǭb.eP5GxOT#Derڈd6UGʷ~SuLWZ9n9Q6@ mL Kܶh- |_ɪ`FFEۂG!*% m kDuM*XH!p>4<}&Ѝ}P=[{i}4.azFC9oNjvs'< tC;\ lTT9v|ba%QEY6sm2Η%%{hgtAO8 *? ~B1ђ}8)sQHSE;cpՅCgm2prʝкxt[M;,65 :k8 jGiq*gVwv3$]G#8X螦bYS%BY1i:}h'~ԝ{wFe3YUFp2ߩ&UtP&i.|vfFaY/޵q$B菻K69[ ͭ ;4pN\;kOu8W~ȝ>|h=\ !L/\xwQ}K5R> ϯ{.V٦UCz:^_?﫤|r4Ӄ&ODs3z<0ĸ7_WߟILPE6˹~ Z#L n"j7 xX,xo@&J Z 1\S&TSxqyxbeoxAgnrthrcYǤ@QAagh)OJqR*47.SJULђ.= C1Ol&&/Y)5}gpkPѶ62P]F]駬<6 r4d6p< bPLaWV3%HF9neR3Q}:b΁wQړQ: +Ve'L*rkŸ+uk^hso>k% 4~-Y9*o?ܠ?'P=h  %A(iUD(&I#Q%RXbB-qHJR0ѸN xbq@CTsf+kmc)Z` VE&$a1a#k93;U]H5|;/iZ0tJ/(ݵ'>_˗Ğm/ `T3CcT(RSmCjND+2T,ʹD72>  ?,F 0E;'4K_3>Ҕovo'p:zaM6sqQXr?m&VNIf ?iϳ_=PfNx{0 Gȩ Aa` J ~aC!ov9c6,+ܦr\&nn`fri|`g ~' '7d n)*)i[Mht\O]](GA*Qm8wmg5hט^]ey7yCaRJQ=;վ\|uT럱*B C5[*X*+P8 ]ʺ-S[xֶtFk83 cՂ(@)ݨU!bc<5 *C휯V(2]'KwSP\ O|-_ljLT#Eߜq6F~^ a싺)w]d_Qۜ?i)֔}L/Z1=̝&Gssed! d2c!.K#e IZ2UVѩ*vdgڭ~ʪvkBB֐)W-}nu1諸NyВugտݚ'.G!Et2J=N?iyQNЀ)Vߙ\9rbCnf|vŃ7 gv&ū]5򅗷EKϛ52+2 fR@q#s+0%>ULQ_GD ,)X#o%qN)$LђUV e6"F%ي0' s#J-tvw||N:.|6u;s~{0z̫"W}~#. tfM@Gȯ42 ?ʹMoVT-a4MYJXpźTw{nɚ2/߭9't3Y=*ȺG,M E|,[+h)<,'x405Ysvau՘Tޒ,{7ޚi~\=z=:-FMn1=دdK}ĥzbYwz%^1-[W!G0%NhEܟaohe ѹ"ڄs`& +&bm #Fi]]'*vN&*Lǩ%24BxdSn*D@TQ)e2N,BAR; M$mkYC4a|dMq|rFo9,q8Yibf|5u=^fq7v4d$Ci0 !$s4& *fb 0"BLiԞQ#PĐ jDPFa(J_K iJx,ÄS7FiPC*eց3|OW=v$x|b$~zg0]Kj7Fq ja(sW\b$PO+.I+FRW Z='JI.IpSs\Ӟźn3~ģ\Bhژ./:kl=C)m6icTi.c^O:ܽe? 
|I5ԁhdP}L`S92s=WTׂY "K5Sc=;H;_ 8>'~;UQ9nTa㵉wι&T䨑Ţ;.zD`'~qU{u@hqOfP(8EK{ pIN<я LѴ? M3xig@'.du݌^4Wt֗aj1@D4 HZ2eoqnLVѩ*픧&Kns=[Ehi 7s=Pg} %̂s0Fxjq7j(Y T:*q׆ʚu:mOΖj{UfJlL9~ {X"MoVʾqY)ISpe}w5\??Tto^>⥿?E"*m4$ꡈu2(JfIIdXW8c) =Wh^]PЋQqJjX!V;9L(!M#!cyT  gTpIXH3&" 1-^dNC%K_0 1UeGD%q $ 2DE6D2a,aèNcĄԪ4CMxhx(NS[AC)C )S`Pܑ~X՞aV #PjMTjZ2XD HXDF:5ieRK-(ISLذHa5HI;+v8N[teb2na6(PN!-)e A)M0~2QA)ܫpkįxXJ$wnѸUxZ>2x P v,ri3w穢t۬+" pc{"a(`b10x ӖMwRF@M@B8zBSKyl)iBeMLD5D]bif=KbX|>jM:30s 1Fz38@HHwєK׈vVk97iϤJx33FINʐX ZvviI&*y8}}f^UCexW R\+1=|.los{o/8ׯ1挴sUEX4 f^*dw:o3Ml {|2|6dfzPaX\_!)\m>X,Bq^* y?I#BعM6!9>$9 @lH.IiW仧zHI#f8RkZy.׹p%>K_=yl\$:kMP?;#ĬEqQQog;o/M2Vg:j9XYծQj$3]sb_f6Xe=Ez,pzJzfY=w|n s4w:1Gd6y|6/S-]WfU-]C~Fwv;3z^۝7'sDO~iRGѾېWgUΡyu72;K^+`D"u/g 5mZ$DpG}3c+\W*)OZ!e"Z&mɈ'h&h >.LfWI'5k2cCu6>|ٺT'o(ҀH Wnj+;P݀ TҊC1ѲUj &Tߌbz $0-nVrZcNUeNDގX鑜2{Lg'GW3Bk~{][ _p;@wuj ^9װt\t=)4?FG~g64A{F>wJƏaP;HxXjo)=w 47߂әݪ;mGCY7^˱\G 1W8e`cH1^5d܂äqEĝQfCL#4f ևg3j]qMiP[oA#OJ.Վasˡb6$w.dqԕvDSnMiPEt~v둧O$SLznͫVnmH\D;}v$G7Ai:^ WnmH\D)f ն_] fˢCSiBmrs}Vcl{Pfk(6"%5ǮcɔC@=52Z`0fRXjSEp)j> .4~_s  E6"CYPܸscY^O(s.MPoY˽33){c~Q=IL@x.#*ׁzp~. R abkScyn*c;:X{bYΤ|qkM'^w\4;Y$&pt O66Lb]֣~-n0lԂ q7w%(іj!IYpfn*]E+ &NHv~!bǬ*佔(<F'LH0Jbn4XLoFxB;Jb0 sk-<%MHZ-,6O֡^8}m@}]`P(4 837e4Irh4P5J*5\M`hF}^U!jW͕a(}dF3.D[a)$4}!ɢ.E۸赸ԗu0Xwt+/aJ=TX஡eۍؼ`ji?O`x1P\/`~^\`8\׋j_A+G9%!Lt;\?U)&'%-ȻtweT\{N%K)mç2hģ&GPJ?ن`#Mr p֔y?JD.m |&^d5)ʹlس5 v}eTMxlӕwVU 쐈B~=A>nsUC\7 @˳- \P[>;/3g .sݤk:=LS. chU s9Ys H;K'Q }nC e*|w{/V͍F9*zoI0LG ScS@}vT`U)l7PJrɸE">BRQ A[Sa;Uv*a G}10s)B,IQ isZ@k \#NzKfJu41AWZC2ɷ@D#,he`+IH6Ecn)ff%5aP-8)4QCKgimÌĂ!/n'6bmBܪtnN5hVzpbbF׊ftmX9ȀPBj5;@Bwa!']۰QS`PPb #PI׈FA}/%Q"Y(zjH᤻W ]_ĥ.yujwc|rz{8R"F[\/"T>m^ygPdVyYEM fty3nD%~< k*g}2wИRzڲ{.pCx pA9{6u[U\M)f!2뙅2"zmU4H)uQ\W#CIN?ܩtEu vʙMp|i|\|f~):sK:UC*X'iT1& #\2YtJʯ!|DK=N6MhsZ@)P§רepiP] (B stҌuT! 
EV8:zFސ%OeSRB)ނ"NVlT F'7md$W1&"  /lR?Jyk$h HTGsG`;b b.#94#&,FA^+,folcvz>_Y lt~kW& {Ba7 qjZ!ϰ1ɽzqSiY}L8X #s~ё6.Nl$Y0R*Js$Ni!0 4?6pޢ`l L` Qrf,RdH*AEH;G']5j`jvtLMPA:Ai!zk5OZg"H&8eki`+4H:|׷[TNJ5A' ~!H,B ][0CG;O0ms lq)j tijaBj#"t"c( )FC(?Tclzr5ˑ$eH8~Q+H<-EL KVO _-RR@? LӜZ&6ވg e7sMo"y\,A3mODK59125ZF62YQ# BVc kHgUY)R8kZRGޔF`vF|޺Љ_m&~ %p&% O.%m_lؠ@ ;Y/Qk@n![596*#Z0c+`6yhaTCEt')q_̴c4Z (9F NEk1D. *)Wy9e-s}b 04D1D1{sSܴAzC_ `6OpӖtyiƯpB<ǍH pg 1Bɧ e4A㝣F-ej'QSRHx td&˓$RV谑&S|gsب`^ƽ'i9J~-:xQ]q"tq4rg/&!w8ٜTw/[p̬n̘*yX Ĺq:lGoh Q1[+-+-5wV`bWJQlj(*Pbɦ(2r]r: w( cU)jvU\IOR IppYWED)י?n4 -\Cz0z烏ّ4ĕâu ske}gן}T[uʎh1[?.-N m#hS*JG8VilR^X&i&aP$F6d;eEV_gv2YoKW?pq~p‡8?p_??Ϳ~Q~j4hWXr7b/ROs:3lYa;L(X~Uu:$< \_ !&Cq3zG&)bGa4XvxHvuI;(4_A ̇ѝ4փ,r$.Yn@`E,0OVmW6f~= XM̻ E m4O?Tu9hH澥}2#oC 4 xIG믏~OG7O"L6cMI")h#X: c(= `M}B_8`ypF5?fy .? '?4;P?ӓwx?5pO?,LyhOX/`Oz4?_ZG7viy^v D]Y)t xÝwjwU} -QAo]"y5xO]7Ϯ' 9a7E.Gn+ y`m<,M( 裾?VUFUy]r|GyzQ7ݛK&G}&g1xTw'U:DXz֑ @~/.|O|G΄nZ_ҁYVd;wg4j"qĢ)ɻ~=;~] v:/X-;"/魒,?;W%1!ww: K e[7 ?lC't>Fb=n:%-vيIs 6bY;-W jfh \7T'LnK<y[[Ґr VSÒzp([`zd ¶d1Tu{n2vZ"Ro%m9mTY${ ̮ԁy21$hۉ@ !oSo06+bWQ(FEkw`5T{!̈b3rҚ}l%\GqbE4 %Xb{Ѯh \=eZg7jp3Hp-IFzT}71aa,‚$@REA|Ξ(g8ow o2YemҔDXMfrp <a$EU<4=Ɵ)^fk$R4W 蓃lQZC:}XL0.D鈛Y"zB3+)%>xjv_ϪzA sؒV; :B9J"qAYbLeG {g\Zvu};"7Lu;t! 4 D"a_JT̸Tzj2K{ӗ"깱`7GqTøU%!q{D-D)a0xboBE5Ov97`래$^`n˺"yKv_ RP2X"qB9Xx%]w .vGg Бų3:ܭv{۶jZڊ12$,q3DN'sOM1яh>}Wwim@B~;sCOqzRb6hOS>1F7[:&RO*HFҠHRda;K(+060Ǖ֚KbO%qM0 #1[X%=$?e:%%ivy*u<3ŐLhsQ>rsoc6g) G\nm.AfR^^\r p1 p 훏.1u̐^p20}xqYv\0{+rS)mO/t̉Y )DQ)lҜdeᜆrps|ുv(O~g N }gy~im$"]J a3< JsmQRԡ57㖴u3yptüRUw]].)Rfp05Brx R7 W Lk@v@QlwF㴖,C FS|[V2ӳ;ˣ_=m6͢.]9Jm$'m2 2*xW}Ingd|FdyVy1q{&$+G]MMb&qÖ$\ɒ0y)$mHLo! guG5[%As])T1Kr !փ!20_Rr24yj6xdg8w \Na[F &r dJ{ udH874/Plr8W+L*VGY o'S(86& 88IJM^rIt-AFQ}s_UDm$ňb ٺQFY29B@.,@DףInSb&Iy/I 7i Ԙ;朲~,>N?WUEVB E .R(Gvi'͠q1Jk2߭6wIx~g-2qa/Lja\9>O}tB+~,-IҊ}u4le?U^g5l/0!:/r ;{gRDzDI} AD[0_|6pOP$zŀ;b}!io&"ԂS9yEH32u!C^o_ &{^ p>  9KրwAۢudtLl ? 
;W`y#Kž֧ ~IZ8e&J1Od\IuApxZHi3b!K-d HI k < '#S"g3UYd)\Iv88ĂaWZ5xZ恇Ujzu+qXִ|n c,Ѭraҳ}2W920J*PƑcep!=Y7FԽz:n5iGU%xa)m$O,y p L#//:Ko}ů R#iHe4oW{b-Sdq d眱;Pz[-FBS,…) MU" E0.+7s?ڧm5H߼y aQjqsTem /YXpuS9/cNLAgn ۾K[q箃!ФIsuyQ`…D^(ΠEPamǙ?3ѱhrh)& ,ZeDBpBӽ y ="gY{8+:Լ!7V&FݤꀘL[".%-)RTHHry&~ J.g@tjj-\e_PmQBc( j>߃BL<䄉p %ߓ i0~>p6V_f 5vGn'`W@3Ct% &9r"6AnP-`{$@ܠ׋/%F T663 )Ooo},[ap/5ىgm$_b$:d0hw y$$6< ,tbHfgJ~գN\v.nF@IdE`H|q=V<!٠ ݌N6Y3u<|gNPL)RptHN$ΓڠZ`Ϥݙ_Opڹ 4E5dt^:6K^櫖Z~MfYgPsј3Rk4D.4zRgPK_UG\.U)i~Io:yIUGș鈟V=n&0ø77u=q~~Xc%5=/(UOGf~ܺI޾g8XuW5{l/AVEw-&kq2L}wgO;UWq<5%[ޟTvRuAq:E}4hu^W&K8sN@P`v\g$L;யy7̼?YG ח&/gfi,oge&6˫o( 'elzAIse9 tˬi™L!z%SqFi#μ(0dWESq@F̃8ɲ{٠x]t՛^zetBn;\x ۡ ݁#B4qU! Oh!/@r:bb4O T0RL-%*P!IRE&;56b-1!\ki0sR̥;mξw!`z%[."7f{ugbޅqm3"&uQ&3!(!w F10L{HR'M"=E- A+M4 gqr)I:˂6qZ'>[@Qbgb.I!e$$$IcDC}jaKYZH)JĀT)NjOA,NN MpSLXa)R`|HNǚud֧YHfKH-lR9-' A%pcyb/Qcd <Η` ro8(1-l2#X'hcP`[ᨩjQ &wpF_ ;[IS:yv=}} '삭17>0_>D{1۞Q [AaA3 ccYkh@ՒuZk nעݩ^'zK*(MyKZol7"R{95a uxංtEX(xDjk[ʤސ#¬L4ܔ7]ngP.K>-Qm^6l^Foz']>ҚpR,Яh'o_FsiDz']E_ќrMͲ7p,v~pTœ/m~d&;$χ#>)O$1s;xGͲfdiN몦70''&~ցw6ٹ2؆޻X c*Cr4_$՝zG19NS%w^ dSvZwtV#V?M#xi6sCs#nH)nēSBEKmqVkqLxt=.X9nE#7\}~ak9seX gp:ӑƝ;\Tܛ!p%7OΓI5 V- -\~c!Ζ58f )&޼!!+OrΛ׽ƳV!N=."J'iӻOQ}ȯaGf/PmCBEL1vCQn]iP":]FGjlݺ,nېr])pdQr;cm{sRdm (!/ô vz֣όas|j+-YaMo ׫`4_=gelXN'JrSuMo@rA>JOh:F6pb W*3J%EcR+Hb'vVB-?]_-O]vWT¶^fsqg1nVtbFN Nyv4-e,S $̦PW%9B`/h$u5etZUO*'[ݯ,&l_~+,,,+x?6#.b&~[ fiǧ6EY*+ 'day ٟOXlsV5u i~OgVqǡXorجр։)L)]X.ɖǍZr?#3$?=rsՉrlyaw0[v#Q`'r6ʓgDQ:+" H' F;$aX2Xbt':P_{EXg 5v2L>*Ya]2yGȣb@릯}/~370)ǗxMj7#^$}#dsZ_,vÏ D_z6.gvaRWW G-ɃAf ~2#IE2l7Ɖ$'pERP+bGmyW9_.{́ű]_7//fӼ\6 _d -M)پK bwAD U|LJK{!5䈑Rh#mH@͞n˫wbA[6u]'zυ~%t݃]Hd?e6˰'~ V a}.hi6a9:".zڣq{\siF_n&~  OA9GG5X%^Bi~$UO,l.Hl''U&cJVX*G^ , m~9j+ }z* 1M0I9T91h_96A܊hM3 gPsL@|O>IU>}4g'x/vܶ a {;% ixU~aӏ)eG4N'.Ex; ^tA4Z-8Q*)x؂.hȸ8]J Iw>%"9Ә _ĹwHG`c.}3F2weM~WByCT&mtL! A+XõB.aep 4o׼ 3=rLEޫi4bɩv̛e:,&r(F4W gFΌ*%',C"+Bī]Rgab=op:οP6;Ǜpq(NP&@#@L_/Ek&Qkkj#l臉6aGOҾ|E.h qܧ?* jV_B$' x_9ެ7kmb,7 C.(l4異*̚Y3Ed)]Y d%yʣEZ8W6d,bt-. 
4e}(&M?62hmd`]>_+%eV2#u7@eD, 8~fLJ}&8XJ)wLX_*+c,=X@}(c6BxvZHC5ht>5_4.|~$E4{3Dgk_V+S'4փIV=_D jao`e>KK.3q1.'*(QJJ|`I8hbol[2> }ƙ#)hۛ*H1̈EfKH,FB(]+\ᯧ!QKټ5$mثBj\B3 DLPX%M} ZLRP!9gl4QX׺(ѥh-\ i *nٚ zTEj{"H,ā1mOsJ2 l2DBKP8qN1 9B>S_fivSR(bhZ*,' lTJ}i㨭:p.' C'p:4#U.b6@(ABK'G})NIPv>WRGLzN*Lxg!iW8hrE;,jD8%yQ}RRS弥, <(/:=@僐 %D( & @f78OkGm$h\}D<0ءO0.J˒r+Y(02%p-*uv=hV{{lK:xW>e:t$v)WNB{:(@Y=HAì0*N;Dm)RK);cEg1 N?Ϭ Jg% ŜQZr6Ǐ-z#;}O+b+9\b?+F:ܾ];}ZOγ@=د+mWԧ+W{_IjLrmbnPɣ Pq7]f}(¢%T< p4fsO6oGcy$#Rc_"Zg; э" 0D744an;DR! 9vp\a7 P?!BZRw؋Nvݲ[tw]~DRUߤ%fy{bK]"1MJ[JR [j^H^AݝvRjn-{/l^Z+oRzR.{+71zZz)"w BwYp9E$ԥ"9NbPD·A&Mr32H ؒqݬMiVl-l+ bKb䲋 dl*뜭(9l Bea *!ꛡnϡ ~wkIj(l ;Iօ9Cwj}[:ҩ)\t,W|V!^FdL!r QrvGTiDB2)ۣr6{lϸ1FwG+c푳R%);\kKd<~hJ:~Өp y_s1{/ǿ(S;3]B`AC;5i{[wz m >1GwwcV܏viZCvȟ-g@!%Wܚrn ;D1?@G$^MTy{D}psv4mktdY vpv t}|;y-w_Q߆fbg4v=V7qMGʭme?}q3~|MOJL|ZbehxGGWϟ@?-0ˏ-FK+'GrGnl{VzQD.&!ʲ< ^bL]ZH/zyJi[`k^վzFBkOjs/xyQ*e; iIHDZWy<5?u; NW\{յ2~Lp59NNeeQgJ{۴pzXafзr?^qᷥC\*oW~n@#~<׾Ehxpn9jXRN5`O%B]wWo%KP\ l|yy4'HY}qSwo b~. p$]guKU1Fɒ<%MVhlkƘ]Ck〕."@y. T!oMfՓǚzkY\~n@gAE}iO2Ng/%o%8Re铘C\'& MB >b|.N@-);3קJed2'&X@(c6Bq c";uUdL>Y)ױEiBKEyJËpGr;bE̊DJ,BHI1/} AE"6Tr࠙xvl7j:Dۉke󮾞i~я7VP0);`^)Źs$PRXQ 6\o! C-Δ!ljn-㉭ȓZ!PzJx x]I7{Gՠ$D[ cd) gN_dYd"( dv2EFhSg8 +1q]%zu}bP=mMA>~Kh-: }NJJ ))"2!P`"R zGzwCfG9k)qϡ}204e$沗 Dj{CÝE|E`7[H.\k$NL+zsȬt$"PJq7W-΂fhIoKX׿/HgH4 V{îGE3.DE1]UJ>^%ҨP@\ 渻p(ˋ9͒@OnJLdO&@ {Թ}13~jB vDʐ + R Gc7-u2hm?G@4.|=`2"FLq5, X˻@Ɨ}P2QrFWl/5Q6 (72kBx<L*Vnp9PJ+L`I8hbolk6ۍt)RcB6>7U(T,2[Bb1j-BȍN@6uIgH3ПGJeVPdy VwiPr{0/҇Jݤ/IQ*# XSgAhAH.p*'akc(^lW"ySu嵸Ͼ0l_QU'"@ɪu`%LG}(OiÚOYex=k ,75mF)cK. 
var/home/core/zuul-output/logs/kubelet.log
Jan 21 15:23:54 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 21 15:23:54 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 15:23:54 crc restorecon[4693]:
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 
15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 15:23:54 crc restorecon[4693]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 
crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc
restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 15:23:54 crc restorecon[4693]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 
crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc 
restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:54 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 15:23:55 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 15:23:55 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 21 15:23:55 crc kubenswrapper[4773]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.208711 4773 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211794 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211814 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211819 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211824 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211830 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211839 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211844 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211848 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211852 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211855 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211858 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211862 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211870 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211883 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211888 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211892 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211897 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211901 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211906 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211913 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211918 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211923 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211927 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211930 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211934 4773 feature_gate.go:330] unrecognized feature gate: Example
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211937 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211941 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211944 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211948 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211951 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211956 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211960 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211964 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211968 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211972 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211976 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211980 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211984 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211987 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211992 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.211997 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212000 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212004 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212008 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212011 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212015 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212019 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212026 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212030 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212034 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212039 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212043 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212048 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212051 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212055 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212059 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212063 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212066 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212070 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212074 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212077 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212081 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212084 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212089 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212093 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212097 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212100 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212103 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212107 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212110 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.212114 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212352 4773 flags.go:64] FLAG: --address="0.0.0.0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212363 4773 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212372 4773 flags.go:64] FLAG: --anonymous-auth="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212377 4773 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212383 4773 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212387 4773 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212393 4773 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212398 4773 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212402 4773 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212408 4773 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212412 4773 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212417 4773 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212421 4773 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212425 4773 flags.go:64] FLAG: --cgroup-root=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212429 4773 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212433 4773 flags.go:64] FLAG: --client-ca-file=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212436 4773 flags.go:64] FLAG: --cloud-config=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212441 4773 flags.go:64] FLAG: --cloud-provider=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212446 4773 flags.go:64] FLAG: --cluster-dns="[]"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212451 4773 flags.go:64] FLAG: --cluster-domain=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212455 4773 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212459 4773 flags.go:64] FLAG: --config-dir=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212463 4773 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212468 4773 flags.go:64] FLAG: --container-log-max-files="5"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212473 4773 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212477 4773 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212481 4773 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212486 4773 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212490 4773 flags.go:64] FLAG: --contention-profiling="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212494 4773 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212498 4773 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212502 4773 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212506 4773 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212511 4773 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212515 4773 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212519 4773 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212523 4773 flags.go:64] FLAG: --enable-load-reader="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212527 4773 flags.go:64] FLAG: --enable-server="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212531 4773 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212536 4773 flags.go:64] FLAG: --event-burst="100"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212541 4773 flags.go:64] FLAG: --event-qps="50"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212545 4773 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212549 4773 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212553 4773 flags.go:64] FLAG: --eviction-hard=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212558 4773 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212562 4773 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212566 4773 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212570 4773 flags.go:64] FLAG: --eviction-soft=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212574 4773 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212578 4773 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212583 4773 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212588 4773 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212593 4773 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212598 4773 flags.go:64] FLAG: --fail-swap-on="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212603 4773 flags.go:64] FLAG: --feature-gates=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212610 4773 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212615 4773 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212620 4773 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212625 4773 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212630 4773 flags.go:64] FLAG: --healthz-port="10248"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212635 4773 flags.go:64] FLAG: --help="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212639 4773 flags.go:64] FLAG: --hostname-override=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212643 4773 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212647 4773 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212652 4773 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212656 4773 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212661 4773 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212665 4773 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212669 4773 flags.go:64] FLAG: --image-service-endpoint=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212673 4773 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212677 4773 flags.go:64] FLAG: --kube-api-burst="100"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212681 4773 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212685 4773 flags.go:64] FLAG: --kube-api-qps="50"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212689 4773 flags.go:64] FLAG: --kube-reserved=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212710 4773 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212714 4773 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212718 4773 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212722 4773 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212726 4773 flags.go:64] FLAG: --lock-file=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212730 4773 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212734 4773 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212738 4773 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212745 4773 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212749 4773 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212753 4773 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212757 4773 flags.go:64] FLAG: --logging-format="text"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212761 4773 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212766 4773 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212770 4773 flags.go:64] FLAG: --manifest-url=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212773 4773 flags.go:64] FLAG: --manifest-url-header=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212779 4773 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212783 4773 flags.go:64] FLAG: --max-open-files="1000000"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212788 4773 flags.go:64] FLAG: --max-pods="110"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212792 4773 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212796 4773 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212800 4773 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212803 4773 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212808 4773 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212812 4773 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212816 4773 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212826 4773 flags.go:64] FLAG: --node-status-max-images="50"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212831 4773 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212835 4773 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212839 4773 flags.go:64] FLAG: --pod-cidr=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212843 4773 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212849 4773 flags.go:64] FLAG: --pod-manifest-path=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212853 4773 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212857 4773 flags.go:64] FLAG: --pods-per-core="0"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212861 4773 flags.go:64] FLAG: --port="10250"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212865 4773 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212869 4773 flags.go:64] FLAG: --provider-id=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212873 4773 flags.go:64] FLAG: --qos-reserved=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212876 4773 flags.go:64] FLAG: --read-only-port="10255"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212881 4773 flags.go:64] FLAG: --register-node="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212885 4773 flags.go:64] FLAG: --register-schedulable="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212889 4773 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212896 4773 flags.go:64] FLAG: --registry-burst="10"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212900 4773 flags.go:64] FLAG: --registry-qps="5"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212908 4773 flags.go:64] FLAG: --reserved-cpus=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212911 4773 flags.go:64] FLAG: --reserved-memory=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212916 4773 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212920 4773 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212925 4773 flags.go:64] FLAG: --rotate-certificates="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212929 4773 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212933 4773 flags.go:64] FLAG: --runonce="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212937 4773 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212941 4773 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212945 4773 flags.go:64] FLAG: --seccomp-default="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212950 4773 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212954 4773 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212959 4773 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212963 4773 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212967 4773 flags.go:64] FLAG: --storage-driver-password="root"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212971 4773 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212975 4773 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212979 4773 flags.go:64] FLAG: --storage-driver-user="root"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212983 4773 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212988 4773 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212992 4773 flags.go:64] FLAG: --system-cgroups=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.212996 4773 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213002 4773 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213006 4773 flags.go:64] FLAG: --tls-cert-file=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213010 4773 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213015 4773 flags.go:64] FLAG: --tls-min-version=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213019 4773 flags.go:64] FLAG: --tls-private-key-file=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213023 4773 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213031 4773 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213034 4773 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213038 4773 flags.go:64] FLAG: --v="2"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213044 4773 flags.go:64] FLAG: --version="false"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213051 4773 flags.go:64] FLAG: --vmodule=""
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213056 4773 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213060 4773 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213175 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213180 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213184 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213188 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213191 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213195 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213198 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213202 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213206 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213209 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213212 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213216 4773 feature_gate.go:330] unrecognized feature gate: Example
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213219 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213223 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213226 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213231 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213235 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213240 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213244 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213248 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213252 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213256 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213259 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213263 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213267 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213272 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213276 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213280 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213283 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213289 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213292 4773 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213295 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213299 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213303 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213306 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213309 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213313 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213316 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213320 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213324 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213327 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213330 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213334 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213337 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213340 4773 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213344 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213347 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213351 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213354 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213358 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213361 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213364 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213368 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213371 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213375 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213378 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213382 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213388 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213392 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213396 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213400 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213405 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213410 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213413 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213417 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213421 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213425 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213429 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213433 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213438 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.213442 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.213584 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.221937 4773 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.221980 4773 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222061 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222069 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222074 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222078 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222083 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222087 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222091 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222095 4773 
feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222098 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222102 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222106 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222110 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222113 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222118 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222122 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222125 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222129 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222133 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222137 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222141 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222144 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222150 4773 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222158 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222163 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222168 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222172 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222176 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222180 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222184 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222188 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222191 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222196 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222202 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222206 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222209 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222214 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222219 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222223 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222227 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222231 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222234 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222238 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222242 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222247 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222251 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222254 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222258 4773 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222262 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222266 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222270 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222274 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222278 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222282 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222286 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222290 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222294 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222298 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222301 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222305 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222308 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222312 4773 feature_gate.go:330] unrecognized 
feature gate: AWSEFSDriverVolumeMetrics Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222316 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222319 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222323 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222326 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222330 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222333 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222336 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222340 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222344 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222348 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.222356 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222501 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222510 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222516 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222520 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222525 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222529 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222533 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222537 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222540 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222544 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222548 4773 feature_gate.go:330] unrecognized 
feature gate: VSphereStaticIPs Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222552 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222556 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222560 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222566 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222570 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222574 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222579 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222582 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222586 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222590 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222593 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222597 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222601 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222605 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics 
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222608 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222612 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222615 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222619 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222623 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222627 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222631 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222634 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222638 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222641 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222645 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222648 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222651 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222656 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222660 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222665 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222668 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222673 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222677 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222681 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222684 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222688 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222696 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222714 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222719 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222722 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222726 4773 feature_gate.go:330] unrecognized feature gate: Example Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222729 4773 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222733 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222736 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222740 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222743 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222748 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222751 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222755 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222760 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222765 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222770 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222774 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222778 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222782 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222786 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222790 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222794 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222799 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.222802 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.222809 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.222990 4773 server.go:940] "Client rotation is on, will bootstrap in background" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.225389 4773 bootstrap.go:85] 
"Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.225464 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.225908 4773 server.go:997] "Starting client certificate rotation" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.225932 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.226202 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-05 03:37:36.821796777 +0000 UTC Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.226331 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.231428 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.233148 4773 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.233725 4773 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.242806 4773 log.go:25] "Validated CRI v1 runtime API" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.255772 4773 log.go:25] "Validated CRI v1 
image API" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.257235 4773 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.258970 4773 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-21-15-19-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.259051 4773 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.275924 4773 manager.go:217] Machine: {Timestamp:2026-01-21 15:23:55.274460365 +0000 UTC m=+0.198950027 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b60d999f-1a4a-45e9-ae91-551ff743d8e2 BootID:6f2b8693-eb75-45a0-8d77-3f0db13277ea Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:99:c7:3a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:99:c7:3a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:94:b8:33 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:21:6b:d7 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:3a:d2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c8:fb:2f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:b6:0d:dc:48:a3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:0d:17:71:dc:fd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 
Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified 
Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.276202 4773 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.276364 4773 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.276844 4773 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277046 4773 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277092 4773 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277317 4773 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277331 4773 container_manager_linux.go:303] "Creating device plugin manager" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277532 4773 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277564 4773 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.277919 4773 state_mem.go:36] "Initialized new in-memory state store" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.278013 4773 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.279242 4773 kubelet.go:418] "Attempting to sync node with API server" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.279267 4773 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.279297 4773 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.279313 4773 kubelet.go:324] "Adding apiserver pod source" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.279326 4773 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.281062 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.281181 4773 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.281060 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.129.56.99:6443: connect: connection refused Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.281326 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.281276 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.281548 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.282324 4773 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.282952 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.282983 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.282996 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283005 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283020 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283030 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283042 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283060 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283075 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283096 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283112 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283123 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283346 4773 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.283841 4773 server.go:1280] "Started kubelet" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.284496 4773 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.284495 4773 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.284576 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.285094 4773 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 15:23:55 crc systemd[1]: Started Kubernetes Kubelet. Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.286170 4773 server.go:460] "Adding debug handlers to kubelet server" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.286269 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.99:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188cc858508c5ca0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 15:23:55.283815584 +0000 UTC m=+0.208305226,LastTimestamp:2026-01-21 15:23:55.283815584 +0000 UTC m=+0.208305226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 15:23:55 crc 
kubenswrapper[4773]: I0121 15:23:55.288448 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.288487 4773 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.288597 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:38:49.966716654 +0000 UTC Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.288862 4773 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.288884 4773 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.289921 4773 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.290373 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.291003 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="200ms" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.291068 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.291133 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293197 4773 factory.go:153] Registering CRI-O factory Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293233 4773 factory.go:221] Registration of the crio container factory successfully Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293314 4773 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293325 4773 factory.go:55] Registering systemd factory Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293340 4773 factory.go:221] Registration of the systemd container factory successfully Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293362 4773 factory.go:103] Registering Raw factory Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293380 4773 manager.go:1196] Started watching for new ooms in manager Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.293993 4773 manager.go:319] Starting recovery of all containers Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300326 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300635 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 
21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300648 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300661 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300672 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300682 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300695 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300735 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300751 4773 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300785 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300796 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300806 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300815 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300826 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300852 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300860 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300868 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300877 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300885 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300910 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300920 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300929 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300939 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300949 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300958 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300968 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.300978 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.301032 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.301045 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.301054 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.301064 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303037 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303060 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303075 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303084 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303095 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303105 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303114 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303124 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303134 
4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303167 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303191 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303204 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303213 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303223 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303234 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303243 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303253 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303263 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303272 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303282 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303291 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303303 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.303313 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304040 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304062 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304072 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304082 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304093 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304105 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304115 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304124 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304133 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304141 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304150 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304160 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304169 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304179 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304188 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304196 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304205 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304213 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304224 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304233 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304242 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304251 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304260 4773 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304268 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304278 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304287 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304300 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304309 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304318 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304330 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304339 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304349 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304358 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304381 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304391 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304400 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304419 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304429 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304437 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304446 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304455 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304464 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304475 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304483 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304492 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304506 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304515 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" 
seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304524 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304532 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304541 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304556 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304565 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304576 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304586 4773 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304595 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304606 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304615 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304625 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304635 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304644 4773 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304652 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304665 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304673 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304681 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304694 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304717 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304725 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304733 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304742 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304753 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304765 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304777 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304788 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304797 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304808 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304818 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304827 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304837 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304846 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304856 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304865 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304875 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304884 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304895 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304904 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304913 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304923 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304932 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304941 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304950 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304960 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304969 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304980 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.304996 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305008 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305020 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305032 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305042 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305055 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305067 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305122 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305136 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305149 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305159 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305170 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305182 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305194 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305207 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305219 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305234 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305245 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305258 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305268 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305277 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305287 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.305296 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306823 4773 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306883 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306903 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306918 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306933 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306951 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306965 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306979 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.306994 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307010 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307023 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307036 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307049 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307072 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307086 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307099 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307116 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307129 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307144 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307157 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307173 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307194 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307211 4773 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307228 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307245 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307260 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307294 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307307 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307321 4773 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307335 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307350 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307365 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307380 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307395 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307409 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307425 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307438 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307450 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307461 4773 reconstruct.go:97] "Volume reconstruction finished" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.307470 4773 reconciler.go:26] "Reconciler: start to sync state" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.310536 4773 manager.go:324] Recovery completed Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.319456 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.320735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.320848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc 
kubenswrapper[4773]: I0121 15:23:55.320926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.325503 4773 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.325527 4773 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.325556 4773 state_mem.go:36] "Initialized new in-memory state store" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.378788 4773 policy_none.go:49] "None policy: Start" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.380761 4773 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.380861 4773 state_mem.go:35] "Initializing new in-memory state store" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.381015 4773 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.382424 4773 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.382468 4773 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.382495 4773 kubelet.go:2335] "Starting kubelet main sync loop" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.382804 4773 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.383270 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.383882 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.390929 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.437071 4773 manager.go:334] "Starting Device Plugin manager" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.437304 4773 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.437320 4773 server.go:79] "Starting device plugin registration server" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.437716 4773 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 15:23:55 crc 
kubenswrapper[4773]: I0121 15:23:55.437736 4773 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.437965 4773 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.438064 4773 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.438074 4773 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.448246 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.483718 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.483815 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.484973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.485018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.485031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.485170 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.485989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486093 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486120 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486145 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486288 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486339 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.486987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487004 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487135 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487214 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487315 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487364 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.487945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488064 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: 
I0121 15:23:55.488317 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488356 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.488943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.489141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.489275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.489289 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.489256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.489320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.490916 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.490977 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.491780 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="400ms" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.493363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.493394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.493407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509506 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509570 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509650 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509897 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.509993 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.510013 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.510034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.537999 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.539610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.539659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.539671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.539725 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.540279 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.99:6443: connect: connection refused" node="crc" Jan 21 15:23:55 crc 
kubenswrapper[4773]: I0121 15:23:55.610817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.610876 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.610916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.610950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.610981 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611007 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611012 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611075 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc 
kubenswrapper[4773]: I0121 15:23:55.611223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611302 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611301 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611330 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611438 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.611373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.740830 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.742204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.742256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.742302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.742339 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.742930 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.99:6443: connect: connection refused" node="crc" Jan 21 15:23:55 
crc kubenswrapper[4773]: I0121 15:23:55.834103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.843204 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.855456 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-58e554af56133f9374c602702238b7b6af225f7c07606f0e07283c7f62dec4e2 WatchSource:0}: Error finding container 58e554af56133f9374c602702238b7b6af225f7c07606f0e07283c7f62dec4e2: Status 404 returned error can't find the container with id 58e554af56133f9374c602702238b7b6af225f7c07606f0e07283c7f62dec4e2 Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.861142 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b503ec5a69039aea8206f8f32917c56eff72a839241c7a8c60f9efe95829b8aa WatchSource:0}: Error finding container b503ec5a69039aea8206f8f32917c56eff72a839241c7a8c60f9efe95829b8aa: Status 404 returned error can't find the container with id b503ec5a69039aea8206f8f32917c56eff72a839241c7a8c60f9efe95829b8aa Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.865873 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.884349 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-14662f3b0a36b493f104184fccdd3453c1649f4e3bda801904ea23235a2e6f8d WatchSource:0}: Error finding container 14662f3b0a36b493f104184fccdd3453c1649f4e3bda801904ea23235a2e6f8d: Status 404 returned error can't find the container with id 14662f3b0a36b493f104184fccdd3453c1649f4e3bda801904ea23235a2e6f8d Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.889558 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: E0121 15:23:55.893187 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="800ms" Jan 21 15:23:55 crc kubenswrapper[4773]: I0121 15:23:55.897406 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.905641 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-87ed0f85068214e3dbe3f453012756b4f6539226bf102e6de1e8782bf9ae364d WatchSource:0}: Error finding container 87ed0f85068214e3dbe3f453012756b4f6539226bf102e6de1e8782bf9ae364d: Status 404 returned error can't find the container with id 87ed0f85068214e3dbe3f453012756b4f6539226bf102e6de1e8782bf9ae364d Jan 21 15:23:55 crc kubenswrapper[4773]: W0121 15:23:55.912108 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-669ea733ad1dbe037ed353915edcee753c7a7ae784542964e9d74752d529d7ad WatchSource:0}: Error finding container 669ea733ad1dbe037ed353915edcee753c7a7ae784542964e9d74752d529d7ad: Status 404 returned error can't find the container with id 669ea733ad1dbe037ed353915edcee753c7a7ae784542964e9d74752d529d7ad Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.143467 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.144673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.144726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.144738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.144765 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:23:56 crc 
kubenswrapper[4773]: E0121 15:23:56.145236 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.99:6443: connect: connection refused" node="crc" Jan 21 15:23:56 crc kubenswrapper[4773]: W0121 15:23:56.271640 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.271774 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.285969 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.289045 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 18:23:17.291112537 +0000 UTC Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.387447 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc" exitCode=0 Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.387508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.387586 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"14662f3b0a36b493f104184fccdd3453c1649f4e3bda801904ea23235a2e6f8d"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.387719 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.388834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.388913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.388938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.390236 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="522aa2f92b15aea039b2284fcdc570abc83a76308f412a0a8e6ad57b65698255" exitCode=0 Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.390440 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"522aa2f92b15aea039b2284fcdc570abc83a76308f412a0a8e6ad57b65698255"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.390529 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b503ec5a69039aea8206f8f32917c56eff72a839241c7a8c60f9efe95829b8aa"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.390754 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.392653 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.392752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.392815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.392833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.393487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.393529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.393543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394114 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c" exitCode=0 Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394193 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58e554af56133f9374c602702238b7b6af225f7c07606f0e07283c7f62dec4e2"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394305 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.394952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.395950 4773 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c" exitCode=0 Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.396033 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c"} Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.396086 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"669ea733ad1dbe037ed353915edcee753c7a7ae784542964e9d74752d529d7ad"}
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.396178 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.397034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.397063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.397076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.397919 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a"}
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.397972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"87ed0f85068214e3dbe3f453012756b4f6539226bf102e6de1e8782bf9ae364d"}
Jan 21 15:23:56 crc kubenswrapper[4773]: W0121 15:23:56.402059 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused
Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.402122 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError"
Jan 21 15:23:56 crc kubenswrapper[4773]: W0121 15:23:56.583383 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused
Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.583461 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError"
Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.694606 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="1.6s"
Jan 21 15:23:56 crc kubenswrapper[4773]: W0121 15:23:56.796109 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.99:6443: connect: connection refused
Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.796204 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.99:6443: connect: connection refused" logger="UnhandledError"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.945560 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.946927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.946962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.946970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:56 crc kubenswrapper[4773]: I0121 15:23:56.946992 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 15:23:56 crc kubenswrapper[4773]: E0121 15:23:56.950324 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.99:6443: connect: connection refused" node="crc"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.241826 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.289164 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:28:50.570223559 +0000 UTC
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.402566 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.402626 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.402639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.402651 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.404264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.404290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.404303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.408065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.408116 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.408119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.408134 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.409043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.409071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.409079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411516 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411526 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.411643 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.412324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.412351 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.412361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.413721 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cdc67c3f5d5be1bfbfc1ca14dcd69e5ed4fa055248bb843c3547cc07766227fd"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.413778 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.414318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.414353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.414362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415102 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276" exitCode=0
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415134 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276"}
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415287 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.415935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:57 crc kubenswrapper[4773]: I0121 15:23:57.900772 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.290349 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:39:35.243678115 +0000 UTC
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421353 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403" exitCode=0
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403"}
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421455 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421483 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421483 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.421600 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.422557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.422603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.422616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.423300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.550799 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.552500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.552540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.552550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:23:58 crc kubenswrapper[4773]: I0121 15:23:58.552577 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.205785 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.291600 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 14:33:44.005249771 +0000 UTC
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.427769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58"}
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.427844 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.427865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602"}
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.427895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247"}
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.427961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12"}
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.428848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.428908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:23:59 crc kubenswrapper[4773]: I0121 15:23:59.428921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.128061 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.128232 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.129498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.129541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.129556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.292265 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:35:58.07717184 +0000 UTC
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.439955 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.440191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4"}
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.440256 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.441401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.462034 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.462191 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.462236 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.463830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.463896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.463921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:00 crc kubenswrapper[4773]: I0121 15:24:00.653478 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.293328 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 20:04:46.241266261 +0000 UTC
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.442482 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.442559 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.443669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.443727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.443740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.444168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.444253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.444273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:01 crc kubenswrapper[4773]: I0121 15:24:01.471894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.293593 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:52:13.960186779 +0000 UTC
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.446148 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.448548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.448622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.448639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.940752 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.940992 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.942102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.942138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:02 crc kubenswrapper[4773]: I0121 15:24:02.942148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.294687 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 22:25:29.839949324 +0000 UTC
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.511608 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.511977 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.513414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.513452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:03 crc kubenswrapper[4773]: I0121 15:24:03.513463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.295541 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 13:23:43.268642315 +0000 UTC
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.335032 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.335256 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.337056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.337363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.337576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.340194 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.450219 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.451231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.451274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:04 crc kubenswrapper[4773]: I0121 15:24:04.451283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:05 crc kubenswrapper[4773]: I0121 15:24:05.296688 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 20:00:20.936847732 +0000 UTC
Jan 21 15:24:05 crc kubenswrapper[4773]: E0121 15:24:05.448490 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.296836 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:50:55.45570895 +0000 UTC
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.513804 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.514031 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.515926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.516011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.516039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:06 crc kubenswrapper[4773]: I0121 15:24:06.520556 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 15:24:07 crc kubenswrapper[4773]: E0121 15:24:07.243509 4773 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.286887 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.297368 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 17:01:50.499567288 +0000 UTC
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.458141 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.458974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.459016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:07 crc kubenswrapper[4773]: I0121 15:24:07.459028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:08 crc kubenswrapper[4773]: I0121 15:24:08.048635 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 21 15:24:08 crc kubenswrapper[4773]: I0121 15:24:08.048757 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 21 15:24:08 crc kubenswrapper[4773]: I0121 15:24:08.054704 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 21 15:24:08 crc kubenswrapper[4773]: I0121 15:24:08.054764 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 21 15:24:08 crc kubenswrapper[4773]: I0121 15:24:08.298363 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 09:01:32.603369189 +0000 UTC
Jan 21 15:24:09 crc kubenswrapper[4773]: I0121 15:24:09.299173 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 10:05:30.960647825 +0000 UTC
Jan 21 15:24:09 crc kubenswrapper[4773]: I0121 15:24:09.513916 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 15:24:09 crc kubenswrapper[4773]: I0121 15:24:09.514037 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:24:10 crc kubenswrapper[4773]: I0121 15:24:10.300133 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 21:49:01.309766504 +0000 UTC
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.301147 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 04:27:26.027863429 +0000 UTC
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.480530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.480689 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.481777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.481969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.482102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.486798 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.525980 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 21 15:24:11 crc kubenswrapper[4773]: I0121 15:24:11.543206 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.301885 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 16:06:28.149142356 +0000 UTC
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.469477 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.469523 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.470123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.470153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:12 crc kubenswrapper[4773]: I0121 15:24:12.470162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.055840 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.057226 4773 trace.go:236] Trace[479736716]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:23:59.444) (total time: 13612ms):
Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[479736716]: ---"Objects listed" error: 13612ms (15:24:13.057)
Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[479736716]: [13.612255737s] [13.612255737s] END
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.057455 4773 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.058485 4773 trace.go:236] Trace[79018606]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:23:59.333) (total time: 13724ms):
Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[79018606]: ---"Objects listed" error: 13724ms (15:24:13.058)
Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[79018606]: [13.724756811s] [13.724756811s] END
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.058508 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.059159 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.059266 4773 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.059328 4773 trace.go:236] Trace[796728857]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:23:58.849) (total time: 14209ms):
Jan 21 15:24:13 crc kubenswrapper[4773]:
Trace[796728857]: ---"Objects listed" error: 14209ms (15:24:13.059) Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[796728857]: [14.209730928s] [14.209730928s] END Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.059366 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.060156 4773 trace.go:236] Trace[403686552]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 15:23:59.174) (total time: 13885ms): Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[403686552]: ---"Objects listed" error: 13885ms (15:24:13.059) Jan 21 15:24:13 crc kubenswrapper[4773]: Trace[403686552]: [13.885895597s] [13.885895597s] END Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.060179 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076102 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38622->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076171 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38622->192.168.126.11:17697: read: connection reset by peer" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076269 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58366->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076287 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58366->192.168.126.11:17697: read: connection reset by peer" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076504 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.076525 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.289437 4773 apiserver.go:52] "Watching apiserver" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.291461 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.291734 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.292302 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.292364 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.292422 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.292709 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.292748 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.292800 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.292843 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.292909 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.293135 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.294711 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.294869 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.295751 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.295968 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.296344 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.296517 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.297719 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.302769 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.303703 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.305059 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 10:06:15.181804731 +0000 
UTC
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.336094 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.351780 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.363752 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.373802 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.382598 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.393616 4773 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.393587 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.404100 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.416384 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.423823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.433774 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.444051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.460939 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462280 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462363 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462764 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462835 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462654 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462767 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.462894 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:13.962871899 +0000 UTC m=+18.887361541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.462986 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463047 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463087 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463121 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463168 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463213 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463237 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463259 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463285 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463351 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463352 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463358 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463401 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463448 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463469 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463494 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463673 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463746 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463769 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463921 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463941 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463987 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464066 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 15:24:13 crc 
kubenswrapper[4773]: I0121 15:24:13.464129 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464249 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464379 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464402 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464448 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464470 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464491 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464513 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464557 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464580 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464604 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464647 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464670 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464714 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464795 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464821 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464844 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464868 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464915 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464935 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464980 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465009 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465056 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465104 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465154 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465248 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465274 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465306 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465342 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465364 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465386 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465430 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465500 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465524 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465548 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465595 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465617 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465641 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465664 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465687 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465733 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465779 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465801 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465825 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465850 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465872 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465895 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465940 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465985 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466008 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466090 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466144 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466237 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466260 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466330 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466423 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466458 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466600 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466690 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466935 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466968 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466993 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467015 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467038 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467068 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467098 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467120 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467168 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467192 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467237 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467261 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467286 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467332 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467378 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467460 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467489 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 15:24:13 crc 
kubenswrapper[4773]: I0121 15:24:13.467512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467535 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467561 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467612 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467680 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467739 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467791 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467817 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 15:24:13 crc 
kubenswrapper[4773]: I0121 15:24:13.467850 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467877 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467899 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467923 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.467976 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468025 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468054 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468088 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468114 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468191 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468239 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468424 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468501 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468545 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: 
I0121 15:24:13.468588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468725 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468766 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468813 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468933 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469034 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469067 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469090 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469108 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469127 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.463963 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464000 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464174 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464374 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464422 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464575 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464740 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.464885 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474559 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474564 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465671 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465923 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.466468 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468302 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.468655 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469415 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.469872 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470126 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470180 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470470 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470682 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.470960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.471111 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.471213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474753 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.471394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.471523 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.471598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.472195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.472625 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.472983 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473057 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473085 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473361 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473391 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473441 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473595 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473680 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473742 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473794 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473856 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.473881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474181 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474227 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474853 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474234 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.465423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.474924 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475026 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475028 4773 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.475067 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.475145 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:13.975120014 +0000 UTC m=+18.899609676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475202 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475359 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475369 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475551 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.475911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476075 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476157 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476494 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837" exitCode=255
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476522 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476525 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837"}
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476648 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.476768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.477298 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.477394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.477889 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.478417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.479749 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.481734 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.482625 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.483098 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.484427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.484809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.485923 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.486005 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.486754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.487162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.487201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.487446 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.488168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.488272 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.488415 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.488503 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:13.988463548 +0000 UTC m=+18.912953270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.489047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.489165 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.489229 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.489961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.490032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.490078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.490243 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.492127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.492587 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.493120 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.493671 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.494058 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.494171 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.494464 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.494643 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.494685 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.495347 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.495826 4773 scope.go:117] "RemoveContainer" containerID="37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.495889 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.496282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.496826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.497157 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.497287 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.497833 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.499037 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.499178 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.499180 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.499197 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.499213 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.499295 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:13.999277005 +0000 UTC m=+18.923766717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.499443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.499566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.499901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500710 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500839 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.500931 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501094 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501302 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501370 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501765 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501775 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.501985 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.502053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.502058 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.502483 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.502740 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.502843 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.503417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.503122 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.503811 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.503994 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.504720 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.504735 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.505217 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.505410 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:14.005370263 +0000 UTC m=+18.929859965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505452 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.504313 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.504339 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.504355 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505521 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505610 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505779 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505787 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505833 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.505972 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506989 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506208 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506404 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506800 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.506822 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.504294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507204 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507385 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507429 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507633 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.507645 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.508086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.508236 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.508638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.508787 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.509101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.508837 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.509475 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.509580 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510237 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510344 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510525 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510645 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510832 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.510884 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.511414 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.512441 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.513072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.511672 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.514381 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.514851 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.515408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.517171 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.519791 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.521375 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.532640 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.532826 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.533538 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.534679 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.545480 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.547344 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.558649 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.567436 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569947 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569958 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569968 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.569977 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570016 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570025 4773 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570035 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570044 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570053 4773 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570061 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570069 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570077 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570084 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570092 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570101 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570109 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570118 4773 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570127 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570135 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570143 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570151 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570159 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570167 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570190 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570200 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570210 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570218 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570225 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570234 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570242 4773 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570251 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570261 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570272 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570283 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570296 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570307 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570320 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570331 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570341 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570352 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570362 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570372 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570383 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570394 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570403 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570415 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570425 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570440 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570450 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570460 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570471 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570482 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570491 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570502 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570514 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570528 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570541 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570552 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570562 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570573 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570584 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570594 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570604 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570616 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570626 4773 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570636 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570646 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570657 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570667 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570677 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570687 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570716 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570726 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570736 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570746 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570757 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570767 4773 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570778 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570792 4773 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570801 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570811 4773 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570820 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570830 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570839 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570849 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570859 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570869 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570881 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570891 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570902 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570911 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570921 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570932 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570942 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570955 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570965 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570975 4773 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570985 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.570998 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571008 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571019 4773 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571029 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571039 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571049 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571059 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571069 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571080 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571090 4773 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571101 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571113 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571124 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571135 4773 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571146 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571157 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571168 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571179 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571190 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571200 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571211 4773 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571223 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571234 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571245 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571255 4773 reconciler_common.go:293] "Volume detached for volume
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571265 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571276 4773 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571287 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571300 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571311 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571323 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571333 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571344 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571356 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571366 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571377 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571387 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571398 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571408 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571419 4773 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571430 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571442 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571453 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571464 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571474 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571484 4773 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571495 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571505 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571518 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571528 4773 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571540 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571551 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571562 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571573 4773 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571583 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571593 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571602 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571613 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571624 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571634 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571645 4773 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571655 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571667 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571678 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571708 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571721 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571733 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571745 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node 
\"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571756 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571767 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571778 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571788 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571799 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571810 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571821 4773 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 
15:24:13.571834 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571846 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571858 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571868 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571879 4773 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571889 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571901 4773 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571912 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571924 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571936 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571947 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571958 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571968 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571979 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.571990 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.572001 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.578410 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.588292 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.597586 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.598941 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.610195 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.612831 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.620303 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.623218 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.630079 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.632868 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.638544 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 15:24:13 crc kubenswrapper[4773]: I0121 15:24:13.974484 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:13 crc kubenswrapper[4773]: E0121 15:24:13.974659 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:14.974641455 +0000 UTC m=+19.899131077 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.076256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.076311 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.076338 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.076365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076382 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076406 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076417 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076423 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076459 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:15.076446004 +0000 UTC m=+20.000935616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076472 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:15.076466215 +0000 UTC m=+20.000955837 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076489 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076511 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076523 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:14 crc kubenswrapper[4773]: 
E0121 15:24:14.076556 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:15.076545167 +0000 UTC m=+20.001034789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076493 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.076593 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:15.076584868 +0000 UTC m=+20.001074490 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.305675 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 21:20:40.050580068 +0000 UTC Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.383510 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.383660 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.480396 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.480455 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ac8bad4df32d991cfd9b0d1b780494e7b7797b32c46b067824880b0ea8b4f363"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.482345 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.483769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.484051 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.484738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b8eab5a74e4e0f9d58524dc897c8322185fcacc1ff729303a3942d0c1c374fd3"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.486118 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.486159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529"} Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.486172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ebe829e525bd61af795d0b4e25a05bac4b33e6d95444895b052445bcce81f0c4"} Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.494488 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.503532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.523543 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.538873 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.555409 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.569239 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.589240 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.602895 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.623574 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.639314 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.654062 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.671904 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.683641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.701833 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.719982 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.739438 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.758213 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:14Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:14 crc kubenswrapper[4773]: I0121 15:24:14.983194 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:14 crc kubenswrapper[4773]: E0121 15:24:14.983395 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:16.983370968 +0000 UTC m=+21.907860590 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.083802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.083857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.083886 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.083914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.083993 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084013 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084058 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:17.084040312 +0000 UTC m=+22.008529934 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084084 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084100 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084111 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084028 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084154 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:17.084091443 +0000 UTC m=+22.008581075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084164 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084173 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084175 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:17.084167635 +0000 UTC m=+22.008657257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.084214 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:17.084203876 +0000 UTC m=+22.008693578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.305957 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 09:27:20.332916938 +0000 UTC Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.383597 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.383769 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.383895 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:15 crc kubenswrapper[4773]: E0121 15:24:15.384028 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.388317 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.389004 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.390192 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.390888 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.391852 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.392480 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.393139 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.394175 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.394902 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.395785 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.396261 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.397265 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.397740 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.398245 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.399171 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.399798 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.400801 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.401305 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.402310 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.403529 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.404097 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.405176 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.405399 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.406218 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.407632 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.408194 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.409020 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.410394 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.411001 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.412215 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.412842 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.413878 4773 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.413999 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.416199 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 21 15:24:15 
crc kubenswrapper[4773]: I0121 15:24:15.417315 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.417832 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.420024 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.420984 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.421882 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.422511 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.423509 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.423660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 
15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.424165 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.425131 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.425757 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.426688 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.427168 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.428019 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.428513 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.429620 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.430113 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.430915 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.431370 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.432505 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.433232 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.434060 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.435721 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.449192 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.462869 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.476784 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.487509 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:15 crc kubenswrapper[4773]: I0121 15:24:15.497818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.260118 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.261798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.261838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.261846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.261904 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.267765 4773 
kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.268007 4773 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.268929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.269069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.269130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.269194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.269262 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.286869 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.289718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.289753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.289763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.289778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.289787 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.300193 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.303468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.303605 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.303677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.303773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.303831 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.307053 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 07:35:51.922078001 +0000 UTC Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.314029 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",
\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.317196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.317285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.317297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.317310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.317320 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.327785 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.330760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.330992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.331089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.331156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.331219 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.341845 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.341993 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.343391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.343421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.343433 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.343447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.343458 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.382907 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.383057 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.446239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.446291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.446305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.446326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.446341 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.493124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.513479 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:5
8Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.517205 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.520653 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.524609 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.528615 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 
15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.540752 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.548538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.548584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc 
kubenswrapper[4773]: I0121 15:24:16.548595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.548613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.548624 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.551897 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.562753 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.575313 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.585741 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.597615 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.608734 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.632555 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.647386 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.650859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.650898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.650912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.650927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.650939 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.660463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0
c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.672967 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.684665 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.697769 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.709253 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.718386 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:16Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.753349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.753385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.753397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.753413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.753423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.856220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.856270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.856282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.856304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.856319 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.958494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.958531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.958542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.958555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.958564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:16Z","lastTransitionTime":"2026-01-21T15:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:16 crc kubenswrapper[4773]: I0121 15:24:16.998997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:16 crc kubenswrapper[4773]: E0121 15:24:16.999112 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 15:24:20.999090535 +0000 UTC m=+25.923580167 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.060907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.060950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.060962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.060980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.060992 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.099450 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.099523 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.099551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.099584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099548 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099657 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099757 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:21.099684857 +0000 UTC m=+26.024174489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099636 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099785 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099819 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099832 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099797 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099892 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099777 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:21.099768939 +0000 UTC m=+26.024258561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099961 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:21.099942174 +0000 UTC m=+26.024431806 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.099985 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:21.099975555 +0000 UTC m=+26.024465177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.163602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.163646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.163657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.163679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.163706 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.265426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.265538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.265914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.265954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.265969 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.307186 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 11:52:15.83934301 +0000 UTC Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.368524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.368573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.368586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.368604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.368617 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.382814 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.382937 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.382977 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:17 crc kubenswrapper[4773]: E0121 15:24:17.383089 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.471310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.471350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.471360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.471374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.471386 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.573921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.573962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.574002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.574019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.574030 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.677099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.677127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.677135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.677147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.677157 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.780407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.780657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.780752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.780823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.780890 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.883143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.883548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.883758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.883951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.884128 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.986824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.986915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.986941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.986960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:17 crc kubenswrapper[4773]: I0121 15:24:17.986975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:17Z","lastTransitionTime":"2026-01-21T15:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.088989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.089022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.089031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.089044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.089052 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.192051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.192332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.192436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.192529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.192656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.295380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.295670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.295784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.295870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.295962 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.307832 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 02:54:20.043705841 +0000 UTC
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.383067 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:18 crc kubenswrapper[4773]: E0121 15:24:18.383319 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.398261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.398310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.398321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.398337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.398349 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.500546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.500781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.500972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.501187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.501325 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.604000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.604050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.604062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.604079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.604094 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.706098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.706400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.706464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.706549 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.706641 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.808946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.808998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.809014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.809037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.809054 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.911060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.911110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.911123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.911159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:18 crc kubenswrapper[4773]: I0121 15:24:18.911179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:18Z","lastTransitionTime":"2026-01-21T15:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.013621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.013719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.013733 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.013754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.013791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.119444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.119508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.119528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.119552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.119569 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.221292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.221325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.221334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.221349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.221358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.308646 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:18:03.215273969 +0000 UTC
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.323636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.323668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.323676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.323707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.323717 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.383196 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:24:19 crc kubenswrapper[4773]: E0121 15:24:19.383352 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.383824 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:24:19 crc kubenswrapper[4773]: E0121 15:24:19.383917 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.426087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.426331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.426405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.426484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.426551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.529288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.529328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.529341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.529356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.529367 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.583199 4773 csr.go:261] certificate signing request csr-wclpv is approved, waiting to be issued
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.620200 4773 csr.go:257] certificate signing request csr-wclpv is issued
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.632283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.632308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.632316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.632329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.632338 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.734188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.734235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.734247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.734264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.734276 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.836748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.836786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.836797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.836815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.836826 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.939424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.939647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.939742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.939848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:19 crc kubenswrapper[4773]: I0121 15:24:19.939926 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:19Z","lastTransitionTime":"2026-01-21T15:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.042097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.042141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.042151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.042169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.042179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.144940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.144990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.145003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.145019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.145030 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.247764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.247805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.247817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.247834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.247847 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.309777 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:27:17.276918678 +0000 UTC
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.350116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.350164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.350173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.350186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.350195 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.383413 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:20 crc kubenswrapper[4773]: E0121 15:24:20.383536 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.432046 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t2rrh"]
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.432377 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t2rrh"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.434778 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.435818 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.453136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.453203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.453215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.453231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.453243 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.457718 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.477673 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.530800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7s9\" (UniqueName: \"kubernetes.io/projected/16f53932-e395-43b0-a347-69ada1fe11a2-kube-api-access-nk7s9\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.530864 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16f53932-e395-43b0-a347-69ada1fe11a2-hosts-file\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.539399 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.555144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.555175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.555185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.555198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.555208 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.561799 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.585367 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.602462 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.616668 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.621006 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 
2027-01-21 15:19:19 +0000 UTC, rotation deadline is 2026-10-21 16:31:59.663024161 +0000 UTC Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.621055 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6553h7m39.041973237s for next certificate rotation Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.631516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16f53932-e395-43b0-a347-69ada1fe11a2-hosts-file\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.631585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7s9\" (UniqueName: \"kubernetes.io/projected/16f53932-e395-43b0-a347-69ada1fe11a2-kube-api-access-nk7s9\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.631597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16f53932-e395-43b0-a347-69ada1fe11a2-hosts-file\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.631839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.643855 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.650535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7s9\" (UniqueName: \"kubernetes.io/projected/16f53932-e395-43b0-a347-69ada1fe11a2-kube-api-access-nk7s9\") pod \"node-resolver-t2rrh\" (UID: \"16f53932-e395-43b0-a347-69ada1fe11a2\") " pod="openshift-dns/node-resolver-t2rrh" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.657123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.658033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.658079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.658092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.658110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.658121 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.670240 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.745947 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t2rrh" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.761002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.761048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.761060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.761080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.761092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:20 crc kubenswrapper[4773]: W0121 15:24:20.761106 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f53932_e395_43b0_a347_69ada1fe11a2.slice/crio-1172c38035a93d7469648cfa7cd25ff013de3484de30335cbc2a0dbb3ee2d268 WatchSource:0}: Error finding container 1172c38035a93d7469648cfa7cd25ff013de3484de30335cbc2a0dbb3ee2d268: Status 404 returned error can't find the container with id 1172c38035a93d7469648cfa7cd25ff013de3484de30335cbc2a0dbb3ee2d268 Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.864673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.864992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.865002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.865017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.865028 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.900945 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gc5wj"] Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.901560 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gc5wj" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.901921 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rfzvc"] Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.904138 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6f67j"] Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.904374 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.905072 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907186 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907258 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907294 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907522 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907544 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907730 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907826 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.907887 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.908191 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.908603 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.908785 4773 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.912005 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.921107 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.932241 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.945938 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.967426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.967463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.967474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.967491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.967501 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:20Z","lastTransitionTime":"2026-01-21T15:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.970054 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.984716 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:20 crc kubenswrapper[4773]: I0121 15:24:20.999774 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:20Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.012462 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.024190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.034842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:21 crc 
kubenswrapper[4773]: I0121 15:24:21.034949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-multus-certs\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.035010 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:29.034987535 +0000 UTC m=+33.959477157 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-kubelet\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cnibin\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " 
pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035196 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-conf-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035216 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035239 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dff586d5-9d98-4ec2-afb1-e550fd4f3678-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-daemon-config\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: 
\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-multus\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035337 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dff586d5-9d98-4ec2-afb1-e550fd4f3678-rootfs\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25hn\" (UniqueName: \"kubernetes.io/projected/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-kube-api-access-g25hn\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035372 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035406 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-socket-dir-parent\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035434 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-netns\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035450 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff586d5-9d98-4ec2-afb1-e550fd4f3678-proxy-tls\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035464 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srxc5\" (UniqueName: \"kubernetes.io/projected/dff586d5-9d98-4ec2-afb1-e550fd4f3678-kube-api-access-srxc5\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-cnibin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-bin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-hostroot\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035547 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-etc-kubernetes\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-system-cni-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035657 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-system-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035673 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-os-release\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035727 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-cni-binary-copy\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-k8s-cni-cncf-io\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035778 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c298s\" (UniqueName: \"kubernetes.io/projected/34d54fdd-eda0-441f-b721-0adecc20a0db-kube-api-access-c298s\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.035793 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-os-release\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.045613 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.054269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.070020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.070062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.070071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.070086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.070098 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.072179 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.089953 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0
943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.100075 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.116494 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.130284 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-cni-binary-copy\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137070 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-k8s-cni-cncf-io\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137093 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c298s\" (UniqueName: \"kubernetes.io/projected/34d54fdd-eda0-441f-b721-0adecc20a0db-kube-api-access-c298s\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 
15:24:21.137156 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-os-release\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-multus-certs\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-kubelet\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cnibin\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 
15:24:21.137268 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-conf-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dff586d5-9d98-4ec2-afb1-e550fd4f3678-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-daemon-config\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137375 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-multus\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dff586d5-9d98-4ec2-afb1-e550fd4f3678-rootfs\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137416 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g25hn\" (UniqueName: \"kubernetes.io/projected/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-kube-api-access-g25hn\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-socket-dir-parent\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137477 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-netns\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137499 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff586d5-9d98-4ec2-afb1-e550fd4f3678-proxy-tls\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137518 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srxc5\" (UniqueName: \"kubernetes.io/projected/dff586d5-9d98-4ec2-afb1-e550fd4f3678-kube-api-access-srxc5\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-cnibin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-bin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-hostroot\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137597 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-etc-kubernetes\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137618 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-system-cni-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137663 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137684 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-system-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-os-release\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.137751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.137868 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.137919 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:29.137903757 +0000 UTC m=+34.062393389 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.138715 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cnibin\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.138750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-os-release\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.138792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-kubelet\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.138855 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.138884 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 
15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.138899 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.138949 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:29.138930034 +0000 UTC m=+34.063419746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.138980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139006 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-multus-certs\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 
15:24:21.139047 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139051 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-hostroot\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.139081 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:29.139070728 +0000 UTC m=+34.063560360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-socket-dir-parent\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-etc-kubernetes\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc 
kubenswrapper[4773]: I0121 15:24:21.139140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-netns\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139157 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-system-cni-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-cni-binary-copy\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-run-k8s-cni-cncf-io\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-cnibin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-multus\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-system-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-host-var-lib-cni-bin\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-conf-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dff586d5-9d98-4ec2-afb1-e550fd4f3678-rootfs\") pod \"machine-config-daemon-rfzvc\" (UID: 
\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-os-release\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.139869 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.139886 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.139897 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.139746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-cni-dir\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.139926 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:24:29.139918359 +0000 UTC m=+34.064407981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.140010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.140070 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34d54fdd-eda0-441f-b721-0adecc20a0db-multus-daemon-config\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.140409 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dff586d5-9d98-4ec2-afb1-e550fd4f3678-mcd-auth-proxy-config\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.142537 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.143205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dff586d5-9d98-4ec2-afb1-e550fd4f3678-proxy-tls\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.155084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c298s\" (UniqueName: 
\"kubernetes.io/projected/34d54fdd-eda0-441f-b721-0adecc20a0db-kube-api-access-c298s\") pod \"multus-gc5wj\" (UID: \"34d54fdd-eda0-441f-b721-0adecc20a0db\") " pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.157457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srxc5\" (UniqueName: \"kubernetes.io/projected/dff586d5-9d98-4ec2-afb1-e550fd4f3678-kube-api-access-srxc5\") pod \"machine-config-daemon-rfzvc\" (UID: \"dff586d5-9d98-4ec2-afb1-e550fd4f3678\") " pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.161103 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.163427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25hn\" (UniqueName: \"kubernetes.io/projected/7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2-kube-api-access-g25hn\") pod \"multus-additional-cni-plugins-6f67j\" (UID: \"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\") " pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.172320 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.173437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.173460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.173470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.173485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.173495 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.185057 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.200080 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.211896 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.223022 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.234144 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.241372 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gc5wj" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.250380 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:24:21 crc kubenswrapper[4773]: W0121 15:24:21.252288 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d54fdd_eda0_441f_b721_0adecc20a0db.slice/crio-47ba03a0eded2ad425f6d6ba563605e8b576a4dd86d81369e4f1a22e5aa93546 WatchSource:0}: Error finding container 47ba03a0eded2ad425f6d6ba563605e8b576a4dd86d81369e4f1a22e5aa93546: Status 404 returned error can't find the container with id 47ba03a0eded2ad425f6d6ba563605e8b576a4dd86d81369e4f1a22e5aa93546 Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.257317 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6f67j" Jan 21 15:24:21 crc kubenswrapper[4773]: W0121 15:24:21.262147 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff586d5_9d98_4ec2_afb1_e550fd4f3678.slice/crio-23eae9e629e8f7c397d84b71aa5ea71c8b500f1636e05fc17b63f69bc50fc5fc WatchSource:0}: Error finding container 23eae9e629e8f7c397d84b71aa5ea71c8b500f1636e05fc17b63f69bc50fc5fc: Status 404 returned error can't find the container with id 23eae9e629e8f7c397d84b71aa5ea71c8b500f1636e05fc17b63f69bc50fc5fc Jan 21 15:24:21 crc kubenswrapper[4773]: W0121 15:24:21.273177 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcf05b0_62cc_4b1d_a21f_3dca696bf8c2.slice/crio-8bb78faf207980920019ad4c29214e139c2f37275f737214e7cd6637caf3ca23 WatchSource:0}: Error finding container 8bb78faf207980920019ad4c29214e139c2f37275f737214e7cd6637caf3ca23: Status 404 returned error can't find the container with id 8bb78faf207980920019ad4c29214e139c2f37275f737214e7cd6637caf3ca23 Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.276896 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94hkt"] Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277457 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.277754 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.281776 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.281822 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.282154 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.282313 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.282383 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.282564 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.282896 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 
15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.292595 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.304214 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.312273 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 22:00:55.559978043 +0000 UTC Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.317599 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.336406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.348424 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.361260 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.377220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.379817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.379850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.379861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.379877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.379889 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.382831 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.382947 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.383473 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:21 crc kubenswrapper[4773]: E0121 15:24:21.383603 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.396403 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.407995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.419126 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.433420 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.440864 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: 
I0121 15:24:21.440909 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drtp\" (UniqueName: \"kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.440930 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.440966 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.440989 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441007 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441024 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441047 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441083 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441160 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441204 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441220 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units\") pod \"ovnkube-node-94hkt\" (UID: 
\"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441255 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441424 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.441458 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.446532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.458339 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.473873 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 
15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.481783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.481811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.481819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.481831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.481840 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.508091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.508142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.508154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"23eae9e629e8f7c397d84b71aa5ea71c8b500f1636e05fc17b63f69bc50fc5fc"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.509218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerStarted","Data":"e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.509242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" 
event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerStarted","Data":"47ba03a0eded2ad425f6d6ba563605e8b576a4dd86d81369e4f1a22e5aa93546"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.510453 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t2rrh" event={"ID":"16f53932-e395-43b0-a347-69ada1fe11a2","Type":"ContainerStarted","Data":"c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.510502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t2rrh" event={"ID":"16f53932-e395-43b0-a347-69ada1fe11a2","Type":"ContainerStarted","Data":"1172c38035a93d7469648cfa7cd25ff013de3484de30335cbc2a0dbb3ee2d268"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.511764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerStarted","Data":"f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.511813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerStarted","Data":"8bb78faf207980920019ad4c29214e139c2f37275f737214e7cd6637caf3ca23"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.522402 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 
15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.534749 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert\") pod \"ovnkube-node-94hkt\" (UID: 
\"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542143 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542161 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542175 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" 
Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542251 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542264 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542212 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542292 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542313 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542341 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542288 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drtp\" (UniqueName: 
\"kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542449 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542492 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542544 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542640 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet\") pod \"ovnkube-node-94hkt\" (UID: 
\"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542682 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.542741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.543031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.543216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.543398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.545021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert\") pod \"ovnkube-node-94hkt\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 
15:24:21.546766 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.558444 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.562166 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drtp\" (UniqueName: \"kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp\") pod \"ovnkube-node-94hkt\" (UID: 
\"2d23a5a4-6787-45a5-9664-20318156f46f\") " pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.572456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.583621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.583668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.583677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.583702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.583713 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.585826 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.592880 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.596325 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"n
ame\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.609091 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.627532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.659342 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: W0121 15:24:21.663978 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d23a5a4_6787_45a5_9664_20318156f46f.slice/crio-6ad5ea8c47d6b6b6f3e077d79aa034c1ff976d8214d7a3c612155defef9eb3a4 WatchSource:0}: Error finding container 6ad5ea8c47d6b6b6f3e077d79aa034c1ff976d8214d7a3c612155defef9eb3a4: Status 404 returned error can't find the container with id 6ad5ea8c47d6b6b6f3e077d79aa034c1ff976d8214d7a3c612155defef9eb3a4 Jan 21 15:24:21 crc 
kubenswrapper[4773]: I0121 15:24:21.678531 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.687348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.687378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.687386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.687399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.687409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.696266 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.708664 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.721934 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.744573 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.756724 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.767999 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.784111 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.789051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.789085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.789094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.789109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.789117 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.796906 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.812002 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.824114 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.836285 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.845634 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.864631 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.891835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.892062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.892142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.892255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.892332 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.901292 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.943282 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.985306 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:21Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.994544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.994570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.994578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.994591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:21 crc kubenswrapper[4773]: I0121 15:24:21.994600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:21Z","lastTransitionTime":"2026-01-21T15:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.028033 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.097366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.097403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.097412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.097429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.097439 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.200625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.200829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.200934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.201029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.201119 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.304009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.304323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.304336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.304350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.304360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.313223 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 12:08:10.043347161 +0000 UTC Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.383490 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:22 crc kubenswrapper[4773]: E0121 15:24:22.383615 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.406744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.406786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.406800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.406817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.406828 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.509265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.509316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.509328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.509344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.509356 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.516480 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567" exitCode=0 Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.516573 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.518371 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8" exitCode=0 Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.518418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.518450 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"6ad5ea8c47d6b6b6f3e077d79aa034c1ff976d8214d7a3c612155defef9eb3a4"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.539158 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.550685 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.564902 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.578234 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.590370 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.601866 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.612977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.613009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.613019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.613034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.613047 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.627018 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.638605 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.649810 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.666047 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.681347 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.701387 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.713613 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.715144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.715197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.715207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.715220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.715247 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.728369 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.740156 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.751110 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.762108 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.771481 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.782454 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.817242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.817283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.817294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.817313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.817325 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.823045 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.862126 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.903204 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.918959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.918991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.919002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.919017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.919028 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:22Z","lastTransitionTime":"2026-01-21T15:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.956664 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:22 crc kubenswrapper[4773]: I0121 15:24:22.990947 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:22Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.020878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.020916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.020924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.020943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.020952 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.023433 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.062474 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.106829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.123466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.123505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.123512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.123527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.123535 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.149534 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.226051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.226084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.226092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.226105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.226117 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.313816 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 08:49:55.491752938 +0000 UTC Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.329101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.329142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.329153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.329169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.329182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.383868 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:23 crc kubenswrapper[4773]: E0121 15:24:23.384003 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.384533 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:23 crc kubenswrapper[4773]: E0121 15:24:23.384614 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.431224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.431260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.431271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.431286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.431297 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.522627 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef" exitCode=0 Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.522711 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.529830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.529914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.529924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.529933 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.529970 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.536873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.536907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.536918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.536933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.536944 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.541653 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.553578 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.568055 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.581182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.594868 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.608918 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.619421 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.632032 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.644000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.644064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.644079 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.644096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.644109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.649570 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.669439 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.682615 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.695688 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.707957 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.720463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:23Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.746810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.746841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.746850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.746865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.746874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.850102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.850140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.850151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.850181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.850194 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.953037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.953070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.953078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.953100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:23 crc kubenswrapper[4773]: I0121 15:24:23.953109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:23Z","lastTransitionTime":"2026-01-21T15:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.055632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.055661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.055669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.055682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.055718 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.158175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.158212 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.158222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.158236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.158248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.260615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.260663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.260675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.260710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.260720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.314850 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:22:14.299353464 +0000 UTC Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.363163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.363201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.363211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.363229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.363249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.382775 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:24 crc kubenswrapper[4773]: E0121 15:24:24.383010 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.465282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.465332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.465344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.465363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.465375 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.534998 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc" exitCode=0 Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.535087 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.540674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.556926 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.567357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.567396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.567407 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.567424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.567436 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.572780 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.583922 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.597229 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 
15:24:24.617848 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.636382 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.647950 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.659840 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.670002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc 
kubenswrapper[4773]: I0121 15:24:24.670047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.670058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.670072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.670082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.672446 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.683895 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.698408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.711101 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.721688 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.729801 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:24Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.772578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.772618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.772629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.772645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.772655 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.875263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.875305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.875316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.875332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.875342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.977641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.977680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.977694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.977724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:24 crc kubenswrapper[4773]: I0121 15:24:24.977736 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:24Z","lastTransitionTime":"2026-01-21T15:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.080325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.080360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.080369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.080383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.080393 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.182500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.182535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.182543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.182556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.182565 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.227520 4773 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.284529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.284557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.284565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.284577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.284593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.315589 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:58:29.567977382 +0000 UTC Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.383381 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.383390 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:25 crc kubenswrapper[4773]: E0121 15:24:25.383513 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:25 crc kubenswrapper[4773]: E0121 15:24:25.383603 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.387097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.387129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.387137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.387149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.387159 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.397430 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.413381 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.427368 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.446672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 
15:24:25.475391 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.489596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.489665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.489679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.489705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.489746 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.499110 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.514421 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.530237 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.547306 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.547992 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414" exitCode=0 Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.548079 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.562856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.588911 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.592663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.592697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.592710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.592770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.592786 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.602449 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0
c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.616118 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.625934 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.636273 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.649124 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.662098 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.678906 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.679531 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mnvdf"] Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.680664 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.682611 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.682629 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.683046 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.684671 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.694751 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c99260
58f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.695844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.695959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.695976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.696309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.696328 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.730179 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.755101 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.768780 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.780073 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.781409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb8f1575-e899-4d82-ad55-696e10474bf8-host\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.781666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x2hdn\" (UniqueName: \"kubernetes.io/projected/fb8f1575-e899-4d82-ad55-696e10474bf8-kube-api-access-x2hdn\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.781790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb8f1575-e899-4d82-ad55-696e10474bf8-serviceca\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.798841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.798875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.798884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.798897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.798906 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.806896 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z 
is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.825263 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01
-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.838031 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.850255 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.860394 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.878562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.882396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hdn\" (UniqueName: \"kubernetes.io/projected/fb8f1575-e899-4d82-ad55-696e10474bf8-kube-api-access-x2hdn\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.882443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb8f1575-e899-4d82-ad55-696e10474bf8-serviceca\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.882476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb8f1575-e899-4d82-ad55-696e10474bf8-host\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.882524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fb8f1575-e899-4d82-ad55-696e10474bf8-host\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.883416 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb8f1575-e899-4d82-ad55-696e10474bf8-serviceca\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.890596 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.900937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.901247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.901257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 
15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.901271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.901282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:25Z","lastTransitionTime":"2026-01-21T15:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.901702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.908912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hdn\" (UniqueName: \"kubernetes.io/projected/fb8f1575-e899-4d82-ad55-696e10474bf8-kube-api-access-x2hdn\") pod \"node-ca-mnvdf\" (UID: \"fb8f1575-e899-4d82-ad55-696e10474bf8\") " pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 
15:24:25.919377 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.930121 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.942553 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.964846 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.976253 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:25 crc kubenswrapper[4773]: I0121 15:24:25.988510 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:25Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.002314 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.004002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.004040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.004050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.004066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.004082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.014000 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.023860 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mnvdf" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.034507 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: W0121 15:24:26.037349 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb8f1575_e899_4d82_ad55_696e10474bf8.slice/crio-24d0e53a3459ad8b408bcead3384fcd74c0deaab8b618dbb82ba183181bf2157 WatchSource:0}: Error finding container 24d0e53a3459ad8b408bcead3384fcd74c0deaab8b618dbb82ba183181bf2157: Status 404 returned error can't find the container with id 24d0e53a3459ad8b408bcead3384fcd74c0deaab8b618dbb82ba183181bf2157 Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.053489 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.067646 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.082089 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.106412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.106482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.106495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.106512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.106520 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.208939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.208989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.209003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.209020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.209032 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.310936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.310969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.310976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.310989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.310998 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.315685 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 14:50:59.377074281 +0000 UTC Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.383319 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.383462 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.404752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.404794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.404802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.404817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.404826 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.420328 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.424492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.424525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.424535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.424552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.424563 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.436011 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.439045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.439099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.439116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.439136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.439151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.450971 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.453963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.453988 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.453996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.454008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.454017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.464151 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.467348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.467406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.467422 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.467438 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.467822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.479168 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: E0121 15:24:26.479278 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.480748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.480774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.480785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.480797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.480818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.552892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mnvdf" event={"ID":"fb8f1575-e899-4d82-ad55-696e10474bf8","Type":"ContainerStarted","Data":"defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.552945 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mnvdf" event={"ID":"fb8f1575-e899-4d82-ad55-696e10474bf8","Type":"ContainerStarted","Data":"24d0e53a3459ad8b408bcead3384fcd74c0deaab8b618dbb82ba183181bf2157"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.556419 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346" exitCode=0 Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.556451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.559668 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.565645 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.578025 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.582936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc 
kubenswrapper[4773]: I0121 15:24:26.582982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.582995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.583011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.583022 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.597125 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.615257 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.628892 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.644756 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.658261 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.667562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.679856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.686248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.686330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.686342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.686358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.686399 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.691657 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.702678 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.716509 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.736014 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.746236 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.757600 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.767809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.778954 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.790092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.790135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.790143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.790157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.790167 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.802976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.847434 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.887394 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.891898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.891929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.891939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.891955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.891966 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.921960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.968571 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:26Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.994748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.994792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.994801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.994816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:26 crc kubenswrapper[4773]: I0121 15:24:26.994827 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:26Z","lastTransitionTime":"2026-01-21T15:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.001857 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.042604 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.082234 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.097092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.097146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.097168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.097194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.097256 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.122537 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.163937 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.198861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.198898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.198907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.198921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.198930 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.203147 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0
c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.242511 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.282148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.301294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.301340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.301352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.301369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.301381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.316823 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:48:30.211054761 +0000 UTC Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.383415 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.383422 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:27 crc kubenswrapper[4773]: E0121 15:24:27.383551 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:27 crc kubenswrapper[4773]: E0121 15:24:27.383681 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.403473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.403506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.403518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.403535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.403545 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.505470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.505503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.505514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.505530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.505540 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.565590 4773 generic.go:334] "Generic (PLEG): container finished" podID="7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2" containerID="7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f" exitCode=0 Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.566012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerDied","Data":"7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.576040 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.588696 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.602665 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.606956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.606997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.607008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.607023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.607035 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.615183 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.628119 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.643911 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.663141 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.673586 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.684676 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.696541 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.709141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.709178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.709189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.709204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.709217 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.721777 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.765663 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.805473 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.812344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.812372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.812384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.812398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.812409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.843609 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.881938 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:27Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.914803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.914842 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.914854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.914871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:27 crc kubenswrapper[4773]: I0121 15:24:27.914883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:27Z","lastTransitionTime":"2026-01-21T15:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.018670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.018746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.018760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.018784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.018802 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.120652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.120687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.120717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.120735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.120749 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.223841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.223877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.223887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.223902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.223913 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.317885 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:38:39.79653727 +0000 UTC Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.326666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.326727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.326740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.326757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.326769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.382935 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:28 crc kubenswrapper[4773]: E0121 15:24:28.383094 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.428725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.428766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.428776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.428792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.428802 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.530429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.530461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.530471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.530486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.530497 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.572274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.572556 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.575718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" event={"ID":"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2","Type":"ContainerStarted","Data":"c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.586580 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.592418 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.600807 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.613539 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.625976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.632617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.632644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.632652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.632666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.632675 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.636890 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.646296 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.659660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.671742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.684225 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.696036 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.714740 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.724224 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.734508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.734550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.734562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.734581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 
15:24:28.734594 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.740886 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",
\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.751462 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.763658 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.780311 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.791257 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.802269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.814949 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.827454 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.836723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.836756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.836766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc 
kubenswrapper[4773]: I0121 15:24:28.836781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.836791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.842303 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d
3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.859394 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.870956 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.879867 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.890633 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.921443 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.938594 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.938636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.938648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.938664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.938674 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:28Z","lastTransitionTime":"2026-01-21T15:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:28 crc kubenswrapper[4773]: I0121 15:24:28.962535 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.000895 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:28Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.041389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.041424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.041432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.041448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.041467 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.042221 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.081257 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.111645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.111852 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:24:45.111837253 +0000 UTC m=+50.036326875 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.143602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.143633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.143649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.143668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.143678 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.213026 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.213079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.213107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.213141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213201 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213266 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:45.213250517 +0000 UTC m=+50.137740129 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213291 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213312 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213325 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213373 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:24:45.21335713 +0000 UTC m=+50.137846832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213430 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213442 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213450 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213477 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:45.213469093 +0000 UTC m=+50.137958815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213529 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.213558 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:45.213550285 +0000 UTC m=+50.138040037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.246145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.246201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.246210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.246225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.246234 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.318132 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 04:39:06.124167441 +0000 UTC Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.349001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.349083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.349098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.349123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.349137 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.383501 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.383646 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.384031 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:29 crc kubenswrapper[4773]: E0121 15:24:29.384098 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.451346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.451414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.451478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.451513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.451582 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.553739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.553776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.553783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.553796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.553804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.577811 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.578230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.597899 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.609356 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.621918 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.633500 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.642736 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.653681 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.656312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.656347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.656357 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.656371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.656381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.666757 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7
e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.690901 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.703513 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.717094 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.730889 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.747271 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.759039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.759070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.759080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.759095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.759106 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.762152 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.780856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.793417 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.809445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:29Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.861666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.861724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.861737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.861752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.861762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.963531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.963558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.963566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.963578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:29 crc kubenswrapper[4773]: I0121 15:24:29.963586 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:29Z","lastTransitionTime":"2026-01-21T15:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.066036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.066074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.066086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.066099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.066109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.168566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.168613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.168626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.168648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.168659 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.271452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.271492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.271502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.271549 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.271559 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.318997 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 00:50:26.339695109 +0000 UTC Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.373947 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.373993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.374005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.374020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.374031 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.383277 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:30 crc kubenswrapper[4773]: E0121 15:24:30.383464 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.476751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.476787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.476796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.476809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.476817 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.578573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.578740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.578800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.578814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.578823 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.581927 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/0.log" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.584888 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737" exitCode=1 Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.584990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.585511 4773 scope.go:117] "RemoveContainer" containerID="f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.599625 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.616901 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.628867 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.637588 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.654980 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"message\\\":\\\" 6090 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0121 15:24:29.951568 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:29.951601 6090 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0121 15:24:29.951642 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:29.951669 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:29.951675 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:29.951687 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:29.951695 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:29.951713 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0121 15:24:29.951732 6090 factory.go:656] Stopping watch factory\\\\nI0121 15:24:29.951737 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:29.951748 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:29.951752 6090 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:29.951757 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:24:29.951758 6090 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:29.951762 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 
15:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb
0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.657856 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.664523 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.674306 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.680853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.680895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.680906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.680923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.680934 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.687567 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.697829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.711279 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.759759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16be
ceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.780492 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.782757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.782783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.782793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.782809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.782820 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.794230 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.806086 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.817308 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.829565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 
15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.841565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.853250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.863790 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.874914 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.885045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.885083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.885092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.885109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.885118 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.888170 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.899392 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.915910 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.936428 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"message\\\":\\\" 6090 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0121 15:24:29.951568 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:29.951601 6090 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0121 15:24:29.951642 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:29.951669 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:29.951675 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:29.951687 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:29.951695 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:29.951713 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0121 15:24:29.951732 6090 factory.go:656] Stopping watch factory\\\\nI0121 15:24:29.951737 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:29.951748 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:29.951752 6090 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:29.951757 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:24:29.951758 6090 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:29.951762 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 
15:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb
0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.954109 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.973170 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15
:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.984761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.987545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.987577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.987587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.987600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.987610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:30Z","lastTransitionTime":"2026-01-21T15:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:30 crc kubenswrapper[4773]: I0121 15:24:30.997478 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:30Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.008723 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.019642 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alert
er\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.089807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.089844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.089852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.089866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.089874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.192529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.192577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.192589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.192612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.192625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.294413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.294455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.294466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.294482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.294493 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.319550 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 05:07:04.962453745 +0000 UTC Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.383436 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:31 crc kubenswrapper[4773]: E0121 15:24:31.383573 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.383985 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:31 crc kubenswrapper[4773]: E0121 15:24:31.384136 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.396356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.396403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.396415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.396431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.396442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.499207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.499245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.499255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.499270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.499279 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.592368 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/0.log" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.595454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.595621 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.601324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.601367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.601379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.601397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.601411 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.613498 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.625355 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.635814 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.649950 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4a
fedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.661521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.679914 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.694559 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.704372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.704575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.704640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.704741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.704835 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.716232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"message\\\":\\\" 6090 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0121 15:24:29.951568 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:29.951601 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:29.951642 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0121 15:24:29.951669 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:29.951675 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:29.951687 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:29.951695 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:29.951713 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0121 15:24:29.951732 6090 factory.go:656] Stopping watch factory\\\\nI0121 15:24:29.951737 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:29.951748 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:29.951752 6090 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:29.951757 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:24:29.951758 6090 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:29.951762 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 
15:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.729310 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.743831 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.758561 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.774501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.791760 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.808545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc 
kubenswrapper[4773]: I0121 15:24:31.808607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.808623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.808644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.808656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.819943 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.834823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:31Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.912147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.912213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.912226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:31 crc 
kubenswrapper[4773]: I0121 15:24:31.912242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:31 crc kubenswrapper[4773]: I0121 15:24:31.912254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:31Z","lastTransitionTime":"2026-01-21T15:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.015339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.015416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.015431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.015447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.015472 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.118527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.118635 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.118671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.118742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.118763 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.222039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.222086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.222098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.222114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.222126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.320615 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 05:51:13.509733128 +0000 UTC Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.324842 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.324889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.324901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.324920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.324932 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.383784 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:32 crc kubenswrapper[4773]: E0121 15:24:32.384169 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.428060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.428155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.428188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.428241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.428266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.531181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.531216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.531224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.531239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.531248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.598878 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/1.log" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.599566 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/0.log" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.601810 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596" exitCode=1 Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.601843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.601888 4773 scope.go:117] "RemoveContainer" containerID="f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.602512 4773 scope.go:117] "RemoveContainer" containerID="bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596" Jan 21 15:24:32 crc kubenswrapper[4773]: E0121 15:24:32.602663 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.613482 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.625716 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.633488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.633514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: 
I0121 15:24:32.633522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.633535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.633545 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.638077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.647456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.658614 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.675496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"message\\\":\\\" 6090 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0121 15:24:29.951568 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:29.951601 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:29.951642 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0121 15:24:29.951669 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:29.951675 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:29.951687 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:29.951695 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:29.951713 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0121 15:24:29.951732 6090 factory.go:656] Stopping watch factory\\\\nI0121 15:24:29.951737 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:29.951748 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:29.951752 6090 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:29.951757 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:24:29.951758 6090 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:29.951762 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 
15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.685311 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.697574 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.708634 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.718463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.729821 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.735735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc 
kubenswrapper[4773]: I0121 15:24:32.735774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.735785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.735799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.735809 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.749253 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.761654 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.774594 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.786539 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.837886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.837929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.837937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.837953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.837963 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.940186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.940242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.940253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.940269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.940279 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:32Z","lastTransitionTime":"2026-01-21T15:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.980806 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf"] Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.981312 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.982716 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.983608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 15:24:32 crc kubenswrapper[4773]: I0121 15:24:32.993724 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:32Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.005074 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.021818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.033848 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.042420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.042457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.042475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.042490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.042503 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.045417 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.046572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.046607 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.046621 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fgn\" (UniqueName: \"kubernetes.io/projected/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-kube-api-access-d6fgn\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.046638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.056828 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/c
rcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.066103 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.074276 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.083472 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.097425 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.110366 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.119720 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.135641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.144430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.144476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.144493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.144513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.144529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148115 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fgn\" (UniqueName: \"kubernetes.io/projected/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-kube-api-access-d6fgn\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.148969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.153563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.155860 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f44bb52aa0e28bdd1a00b663da7b6b66b7d7279739195b019827463857996737\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"message\\\":\\\" 6090 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0121 15:24:29.951568 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:29.951601 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:29.951642 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0121 15:24:29.951669 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:29.951675 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:29.951687 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:29.951695 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:29.951713 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0121 15:24:29.951732 6090 factory.go:656] Stopping watch factory\\\\nI0121 15:24:29.951737 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:29.951748 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:29.951752 6090 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:29.951757 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 15:24:29.951758 6090 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:29.951762 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 
15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.167769 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.173794 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fgn\" (UniqueName: \"kubernetes.io/projected/c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b-kube-api-access-d6fgn\") pod \"ovnkube-control-plane-749d76644c-rnwxf\" (UID: \"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.182440 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.247207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.247243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.247252 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.247267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.247279 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.294023 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" Jan 21 15:24:33 crc kubenswrapper[4773]: W0121 15:24:33.307198 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc84731e4_6a51_4d0f_9a8b_3cfcf8b4738b.slice/crio-7a9aa6728e2a5ebbd9d443133c82ad5204ffa0e4fcf860e0b1e9565a664272e0 WatchSource:0}: Error finding container 7a9aa6728e2a5ebbd9d443133c82ad5204ffa0e4fcf860e0b1e9565a664272e0: Status 404 returned error can't find the container with id 7a9aa6728e2a5ebbd9d443133c82ad5204ffa0e4fcf860e0b1e9565a664272e0 Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.321534 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 07:50:30.599868119 +0000 UTC Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.351705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.351767 
4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.351780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.351796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.351808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.383377 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.383463 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:33 crc kubenswrapper[4773]: E0121 15:24:33.383575 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:33 crc kubenswrapper[4773]: E0121 15:24:33.383759 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.454951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.455009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.455028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.455055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.455073 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.557303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.557343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.557374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.557394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.557404 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.607964 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/1.log" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.611164 4773 scope.go:117] "RemoveContainer" containerID="bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.611313 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" event={"ID":"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b","Type":"ContainerStarted","Data":"8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.611365 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" event={"ID":"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b","Type":"ContainerStarted","Data":"97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.611379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" event={"ID":"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b","Type":"ContainerStarted","Data":"7a9aa6728e2a5ebbd9d443133c82ad5204ffa0e4fcf860e0b1e9565a664272e0"} Jan 21 15:24:33 crc kubenswrapper[4773]: E0121 15:24:33.611385 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.636256 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23
173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.652116 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.664714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.664757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.664767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc 
kubenswrapper[4773]: I0121 15:24:33.664783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.664792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.671366 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.689118 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.705348 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.721370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.736227 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.748733 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.758920 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.768719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.768754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.768761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.768777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.768787 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.769849 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.780718 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-
21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.799372 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.812867 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.824639 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.837788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.863340 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.870844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.870877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.870887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.870901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.870910 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.887631 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.902995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.916451 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.930204 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.942796 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.955767 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 
15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.968295 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.972811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.972849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.972858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.972876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.972886 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:33Z","lastTransitionTime":"2026-01-21T15:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.981138 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:33 crc kubenswrapper[4773]: I0121 15:24:33.990788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.001097 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:33Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.010473 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24
:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.022292 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.033537 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.043506 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.045932 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8n66g"] Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.046397 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.046468 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.057882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.057949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2mz\" (UniqueName: \"kubernetes.io/projected/1a01fed4-2691-453e-b74f-c000d5125b53-kube-api-access-ms2mz\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.059436 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.075118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.075149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.075157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.075172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.075182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.077650 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.103730 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4a
fedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.142358 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.158447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.158550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2mz\" (UniqueName: \"kubernetes.io/projected/1a01fed4-2691-453e-b74f-c000d5125b53-kube-api-access-ms2mz\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.158649 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.158752 4773 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:34.658732617 +0000 UTC m=+39.583222239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.177780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.177816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.177828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.177844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.177855 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.183560 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.208080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2mz\" (UniqueName: \"kubernetes.io/projected/1a01fed4-2691-453e-b74f-c000d5125b53-kube-api-access-ms2mz\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.241715 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.279868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.279915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.279927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.279946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.279960 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.283201 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.322428 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-11-08 22:27:57.508492695 +0000 UTC Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.326133 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.362768 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.383217 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.383807 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.385050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.385101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.385116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.385140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.385163 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.405462 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.448611 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.488483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.488778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.488976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc 
kubenswrapper[4773]: I0121 15:24:34.489112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.489198 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.496629 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d
3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.531449 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.571660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.592219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.592262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.592274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.592290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.592302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.601820 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.642129 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.663326 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.663875 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:34 crc kubenswrapper[4773]: E0121 15:24:34.664459 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:35.66442361 +0000 UTC m=+40.588913272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.685909 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc 
kubenswrapper[4773]: I0121 15:24:34.695354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.695390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.695399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.695413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.695424 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.730847 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.764406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:34Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.798028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.798248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.798321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.798382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.798442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.901445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.901515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.901542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.901572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:34 crc kubenswrapper[4773]: I0121 15:24:34.901591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:34Z","lastTransitionTime":"2026-01-21T15:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.003860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.004166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.004341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.004527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.004782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.108080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.108320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.108400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.108498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.108582 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.211417 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.211479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.211500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.211527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.211547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.314045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.314077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.314089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.314103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.314113 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.323563 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:07:53.153673655 +0000 UTC Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.383825 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.383939 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:35 crc kubenswrapper[4773]: E0121 15:24:35.384126 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:35 crc kubenswrapper[4773]: E0121 15:24:35.384415 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.395622 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.413048 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.417029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.417085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.417102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.417123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.417137 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.426547 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.438482 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.457485 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.480868 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.503182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.515773 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.520268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.520303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.520314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc 
kubenswrapper[4773]: I0121 15:24:35.520330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.520341 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.536387 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.551049 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc 
kubenswrapper[4773]: I0121 15:24:35.564334 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.578166 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.597934 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing 
use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.612840 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.622768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.622868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.622889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.622916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.622932 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.629239 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.642226 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.655869 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:35Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.673688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:35 crc kubenswrapper[4773]: E0121 15:24:35.674744 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:35 crc kubenswrapper[4773]: E0121 15:24:35.674872 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:37.674851669 +0000 UTC m=+42.599341291 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.725492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.725541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.725556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.725577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.725593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.827768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.827805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.827814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.827828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.827837 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.931515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.931564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.931576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.931595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:35 crc kubenswrapper[4773]: I0121 15:24:35.931606 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:35Z","lastTransitionTime":"2026-01-21T15:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.035222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.035275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.035291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.035313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.035329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.138409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.138452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.138464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.138481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.138492 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.241071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.241168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.241213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.241247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.241269 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.324479 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:03:47.202897609 +0000 UTC Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.343953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.344014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.344037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.344071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.344093 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.382859 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.382859 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.383041 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.383222 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.447169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.447221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.447237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.447259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.447276 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.551046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.551106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.551119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.551135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.551150 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.653411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.653462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.653474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.653495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.653507 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.728929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.729020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.729044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.729075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.729097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.749619 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:36Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.754913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.754981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.754998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.755022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.755041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.769806 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:36Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.774387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.774474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.774493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.774781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.774794 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.786756 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:36Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.790129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.790164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.790178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.790209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.790229 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.802597 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:36Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.806273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.806316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.806325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.806343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.806379 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.816922 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:36Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:36 crc kubenswrapper[4773]: E0121 15:24:36.817030 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.818557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.818589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.818606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.818629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.818644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.921992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.922085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.922104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.922133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:36 crc kubenswrapper[4773]: I0121 15:24:36.922153 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:36Z","lastTransitionTime":"2026-01-21T15:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.025243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.026182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.026206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.026233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.026254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.128228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.128269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.128281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.128295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.128307 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.229991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.230028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.230037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.230052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.230062 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.324776 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:54:08.135137954 +0000 UTC Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.333099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.333168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.333179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.333198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.333210 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.383009 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:37 crc kubenswrapper[4773]: E0121 15:24:37.383159 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.383229 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:37 crc kubenswrapper[4773]: E0121 15:24:37.383412 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.435795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.435855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.435864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.435885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.435895 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.538564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.538645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.538668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.538724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.538743 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.640766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.640819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.640831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.640847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.640858 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.699923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:37 crc kubenswrapper[4773]: E0121 15:24:37.700202 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:37 crc kubenswrapper[4773]: E0121 15:24:37.700364 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:41.70032558 +0000 UTC m=+46.624815242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.743632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.743695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.743747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.743774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.743791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.846410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.846452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.846460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.846479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.846488 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.950215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.950248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.950258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.950272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:37 crc kubenswrapper[4773]: I0121 15:24:37.950283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:37Z","lastTransitionTime":"2026-01-21T15:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.052938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.052976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.052984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.052998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.053011 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.155724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.155765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.155780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.155796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.155808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.258420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.258468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.258482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.258498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.258512 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.325864 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 09:18:05.729882224 +0000 UTC
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.360565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.360607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.360621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.360636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.360647 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.383198 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.383211 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:38 crc kubenswrapper[4773]: E0121 15:24:38.383410 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:38 crc kubenswrapper[4773]: E0121 15:24:38.383314 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.462980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.463025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.463039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.463056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.463068 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.566457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.566526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.566543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.566566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.566584 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.668800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.668867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.668889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.668917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.668940 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.771119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.771154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.771166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.771184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.771197 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.873620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.873670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.873679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.873706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.873717 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.976655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.976689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.976726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.976739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:38 crc kubenswrapper[4773]: I0121 15:24:38.976747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:38Z","lastTransitionTime":"2026-01-21T15:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.078828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.078870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.078881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.078896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.078907 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.181367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.181424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.181437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.181453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.181462 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.283148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.283186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.283194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.283205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.283214 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.326164 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 08:12:33.116847567 +0000 UTC
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.382852 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.382914 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:24:39 crc kubenswrapper[4773]: E0121 15:24:39.383006 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:24:39 crc kubenswrapper[4773]: E0121 15:24:39.383166 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.385372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.385420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.385429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.385495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.385508 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.488617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.488662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.488688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.488742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.488755 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.590764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.590822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.590835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.590853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.590864 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.693414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.693471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.693488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.693514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.693532 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.795928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.795962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.795971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.795986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.795994 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.898078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.898177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.898196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.898221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:39 crc kubenswrapper[4773]: I0121 15:24:39.898237 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:39Z","lastTransitionTime":"2026-01-21T15:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.001199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.001250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.001265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.001288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.001301 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.104370 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.104409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.104419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.104433 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.104444 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.210752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.211283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.211318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.211342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.211359 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.314842 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.314912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.314934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.314963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.314984 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.327068 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:03:27.495460713 +0000 UTC
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.383842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:24:40 crc kubenswrapper[4773]: E0121 15:24:40.384035 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.384468 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:40 crc kubenswrapper[4773]: E0121 15:24:40.384612 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.417895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.418130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.418229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.418329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.418416 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.521961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.522033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.522052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.522080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.522099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.625209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.625245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.625255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.625270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.625282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.728136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.728227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.728247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.728280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.728308 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.831121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.831231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.831254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.831293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.831318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.933902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.933984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.933999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.934040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:40 crc kubenswrapper[4773]: I0121 15:24:40.934052 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:40Z","lastTransitionTime":"2026-01-21T15:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.036637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.036670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.036679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.036706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.036717 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.139628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.139682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.139727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.139752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.139767 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.242882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.242931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.242942 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.242964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.242978 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.327527 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 07:45:04.707197347 +0000 UTC Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.346219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.346271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.346285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.346307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.346324 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.382969 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.383032 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:41 crc kubenswrapper[4773]: E0121 15:24:41.383168 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:41 crc kubenswrapper[4773]: E0121 15:24:41.383247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.449574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.449641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.449659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.449687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.449736 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.552863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.552972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.552992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.553019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.553038 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.655965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.656018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.656031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.656052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.656067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.745506 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:41 crc kubenswrapper[4773]: E0121 15:24:41.745676 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:41 crc kubenswrapper[4773]: E0121 15:24:41.745776 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:24:49.745757698 +0000 UTC m=+54.670247340 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.758964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.759002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.759016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.759032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.759043 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.862648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.862933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.863017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.863142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.863233 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.966096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.966136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.966147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.966167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:41 crc kubenswrapper[4773]: I0121 15:24:41.966179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:41Z","lastTransitionTime":"2026-01-21T15:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.068804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.068853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.068863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.068883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.068893 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.171727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.171803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.171817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.171844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.171858 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.274809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.274859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.274873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.274891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.274906 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.327765 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:39:00.845185445 +0000 UTC Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382482 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382756 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:42 crc kubenswrapper[4773]: E0121 15:24:42.382869 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.382946 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:42 crc kubenswrapper[4773]: E0121 15:24:42.383252 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.484991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.485051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.485066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.485085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.485099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.588914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.588984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.588996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.589017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.589053 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.691188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.691221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.691230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.691243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.691253 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.793062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.793114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.793128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.793147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.793161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.896125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.896163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.896172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.896186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.896196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.998903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.999008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.999026 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.999074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:42 crc kubenswrapper[4773]: I0121 15:24:42.999091 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:42Z","lastTransitionTime":"2026-01-21T15:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.101014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.101073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.101090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.101112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.101127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.204240 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.204293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.204308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.204330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.204343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.306932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.307011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.307021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.307035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.307047 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.328825 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 17:39:17.362470264 +0000 UTC
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.383493 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.383550 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:24:43 crc kubenswrapper[4773]: E0121 15:24:43.383733 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:24:43 crc kubenswrapper[4773]: E0121 15:24:43.383921 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.409318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.409366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.409384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.409400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.409411 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.511532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.511592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.511602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.511619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.511629 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.615056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.615107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.615120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.615139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.615151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.718239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.718301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.718320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.718340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.718352 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.821281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.821336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.821353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.821375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.821394 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.923445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.923491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.923504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.923522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:43 crc kubenswrapper[4773]: I0121 15:24:43.923535 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:43Z","lastTransitionTime":"2026-01-21T15:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.025933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.025967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.025978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.025994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.026006 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.129079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.129117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.129127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.129140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.129150 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.232254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.232301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.232310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.232323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.232332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.329778 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:25:39.415057435 +0000 UTC
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.335070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.335139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.335155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.335173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.335185 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.382772 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.382759 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:24:44 crc kubenswrapper[4773]: E0121 15:24:44.382930 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:44 crc kubenswrapper[4773]: E0121 15:24:44.383118 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.437183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.437214 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.437225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.437241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.437251 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.543991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.544045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.544059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.544077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.544092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.646437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.646481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.646490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.646505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.646514 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.749061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.749114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.749124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.749142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.749152 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.851505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.851557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.851569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.851586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.851598 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.954266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.954302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.954312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.954326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:44 crc kubenswrapper[4773]: I0121 15:24:44.954335 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:44Z","lastTransitionTime":"2026-01-21T15:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.057493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.057554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.057564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.057580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.057593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.161860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.161915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.161928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.161951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.161968 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.212980 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.213271 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:17.213229059 +0000 UTC m=+82.137718841 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.213429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.213647 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.213674 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.213689 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.213763 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:17.213753702 +0000 UTC m=+82.138243334 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.264832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.264896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.264916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.264940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.264977 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.315099 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.315195 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.315238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315318 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315393 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315398 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315474 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:17.315432614 +0000 UTC m=+82.239922256 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315484 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315510 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315515 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:17.315498075 +0000 UTC m=+82.239987717 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.315566 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:17.315542606 +0000 UTC m=+82.240032268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.330273 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:01:47.313153428 +0000 UTC
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.367875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.367965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.367989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.368018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.368036 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.383462 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.383534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.383594 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:24:45 crc kubenswrapper[4773]: E0121 15:24:45.383726 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.396834 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.412626 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\
\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.430402 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda8
16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.443967 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33
cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.470021 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4a
fedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.472545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.472613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.472633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.472669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.472731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.485490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0
c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.501209 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.516332 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.537492 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.550593 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.564926 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.574787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.574831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.574846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.574866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.574881 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.585872 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.602474 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.617229 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.628837 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc 
kubenswrapper[4773]: I0121 15:24:45.648641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.665396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:45Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.678107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.678149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.678178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:45 crc 
kubenswrapper[4773]: I0121 15:24:45.678195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.678205 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.781460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.781528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.781542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.781566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.781581 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.884285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.884385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.884403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.884451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.884478 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.987097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.987133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.987142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.987155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:45 crc kubenswrapper[4773]: I0121 15:24:45.987165 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:45Z","lastTransitionTime":"2026-01-21T15:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.089869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.089912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.089923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.089939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.089949 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.193293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.193348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.193361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.193381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.193391 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.296656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.296731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.296750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.296775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.296792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.330483 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:37:09.642772418 +0000 UTC
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.383098 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.383126 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.383243 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.383398 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.399963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.399998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.400006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.400019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.400029 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.502489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.502530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.502542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.502561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.502573 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.605259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.605301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.605310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.605326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.605335 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.708014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.708052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.708078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.708090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.708100 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.810915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.810956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.810968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.810986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.810995 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.838387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.838426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.838433 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.838447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.838456 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.851510 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:46Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.855173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.855200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.855208 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.855238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.855255 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.866390 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:46Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.869568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.869611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.869619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.869635 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.869646 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.881408 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:46Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.885118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.885164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.885174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.885190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.885199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.898385 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:46Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.902302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.902362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.902373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.902390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.902402 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.914963 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:46Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:46 crc kubenswrapper[4773]: E0121 15:24:46.915202 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.916917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.916967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.916978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.916999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:46 crc kubenswrapper[4773]: I0121 15:24:46.917014 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:46Z","lastTransitionTime":"2026-01-21T15:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.019384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.019425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.019438 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.019467 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.019480 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.121313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.121348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.121356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.121369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.121379 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.223915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.223955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.223968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.223985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.223998 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.327308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.327764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.327916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.328021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.328100 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.331522 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 09:40:39.781920736 +0000 UTC Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.383437 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:47 crc kubenswrapper[4773]: E0121 15:24:47.383610 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.383726 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:47 crc kubenswrapper[4773]: E0121 15:24:47.384115 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.384560 4773 scope.go:117] "RemoveContainer" containerID="bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.430741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.430959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.431271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.431613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.431991 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.535296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.535360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.535374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.535392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.535404 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.638573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.638634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.638645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.638668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.638683 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.661048 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/1.log" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.662923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.663064 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.683839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.701202 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.716874 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.728932 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.741596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.741639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.741654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.741721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.741739 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.744778 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.760813 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.782278 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.798374 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.817622 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.845524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.845579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.845593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.845615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.845631 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.848443 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.860054 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.878809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15
:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.893438 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.908005 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.918288 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc 
kubenswrapper[4773]: I0121 15:24:47.933109 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952242 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:47Z","lastTransitionTime":"2026-01-21T15:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:47 crc kubenswrapper[4773]: I0121 15:24:47.952822 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:47Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.055287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.055347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.055358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.055373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.055383 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.158807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.159306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.159319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.159337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.159348 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.262961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.263023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.263037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.263058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.263072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.332043 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 03:02:26.797351092 +0000 UTC Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.365048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.365378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.365456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.365532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.365644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.383316 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.383464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:48 crc kubenswrapper[4773]: E0121 15:24:48.383592 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:48 crc kubenswrapper[4773]: E0121 15:24:48.383729 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.467853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.467902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.467911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.467925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.467935 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.572929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.572995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.573009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.573029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.573042 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.668762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/2.log" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.669243 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/1.log" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.671998 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" exitCode=1 Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.672064 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.672136 4773 scope.go:117] "RemoveContainer" containerID="bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.673568 4773 scope.go:117] "RemoveContainer" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" Jan 21 15:24:48 crc kubenswrapper[4773]: E0121 15:24:48.674060 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.675245 4773 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.675307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.675335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.675363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.675386 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.697949 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.711596 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.725108 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.739021 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.754848 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.766843 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.779167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.779205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.779216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.779235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.779251 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.780204 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.790095 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.801052 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.833760 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.847555 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.866191 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.882072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.882432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.882506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.882728 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.882817 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.891864 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc 
kubenswrapper[4773]: I0121 15:24:48.917186 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.931143 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.944038 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.960932 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:48 crc 
kubenswrapper[4773]: I0121 15:24:48.985373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.985445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.985462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.985492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:48 crc kubenswrapper[4773]: I0121 15:24:48.985511 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:48Z","lastTransitionTime":"2026-01-21T15:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.088624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.088678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.088689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.088724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.088735 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.190985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.191033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.191043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.191061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.191076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.293801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.293849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.293866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.293890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.293901 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.333122 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 11:15:52.020749013 +0000 UTC Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.383851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.383919 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:49 crc kubenswrapper[4773]: E0121 15:24:49.384018 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:49 crc kubenswrapper[4773]: E0121 15:24:49.384115 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.395794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.395836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.395845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.395861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.395873 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.499536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.499654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.499682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.499767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.499798 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.602845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.602895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.602904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.602923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.602935 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.678942 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/2.log" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.706336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.706374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.706384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.706404 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.706413 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.762547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:49 crc kubenswrapper[4773]: E0121 15:24:49.762824 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:49 crc kubenswrapper[4773]: E0121 15:24:49.762976 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:05.762942233 +0000 UTC m=+70.687432055 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.809813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.809892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.809910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.809939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.809956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.913406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.913469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.913483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.913507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:49 crc kubenswrapper[4773]: I0121 15:24:49.913524 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:49Z","lastTransitionTime":"2026-01-21T15:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.015629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.015665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.015673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.015688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.015721 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.117996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.118043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.118057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.118075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.118087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.131496 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.141361 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.143456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.155163 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.165781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.176288 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.187883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.201273 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.208196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.209142 4773 scope.go:117] "RemoveContainer" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" Jan 21 15:24:50 crc kubenswrapper[4773]: E0121 15:24:50.209358 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.213396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117
eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.220162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.220336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.220400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.220489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.220549 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.225144 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.237788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.253619 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb021c180e8a692e0b7863b3528d0b3ec884e2ca75e383253be0042be777596\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 15:24:31.321823 6213 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 15:24:31.321843 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0121 15:24:31.321848 6213 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0121 
15:24:31.321866 6213 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0121 15:24:31.321870 6213 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0121 15:24:31.321893 6213 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0121 15:24:31.321926 6213 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0121 15:24:31.321942 6213 factory.go:656] Stopping watch factory\\\\nI0121 15:24:31.321955 6213 ovnkube.go:599] Stopped ovnkube\\\\nI0121 15:24:31.321980 6213 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 15:24:31.321990 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI0121 15:24:31.321997 6213 handler.go:208] Removed *v1.Node event handler 7\\\\nI0121 15:24:31.322002 6213 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0121 15:24:31.322008 6213 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0121 15:24:31.322013 6213 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0121 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc 
kubenswrapper[4773]: I0121 15:24:50.262719 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x
2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.274456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.287180 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.300674 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.314035 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.322846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc 
kubenswrapper[4773]: I0121 15:24:50.323081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.323168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.323255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.323358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.325496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.334013 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 04:28:05.748303193 +0000 UTC Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.350932 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.371189 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.382055 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.383051 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:50 crc kubenswrapper[4773]: E0121 15:24:50.383167 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.383771 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:50 crc kubenswrapper[4773]: E0121 15:24:50.383927 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.394880 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.410004 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.424182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc 
kubenswrapper[4773]: I0121 15:24:50.425683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.425748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.425760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.425780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.425794 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.437335 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.448475 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.461900 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.482860 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.496906 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.510040 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.522057 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.529490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.529564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.529588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.529620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.529644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.535461 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.550380 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.561823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.576993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.595755 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide 
LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.608253 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:50Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.632564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.632609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.632618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.632634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.632645 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.736155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.736206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.736222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.736248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.736264 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.841860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.841902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.841912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.841935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.841956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.943909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.943938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.943948 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.943963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:50 crc kubenswrapper[4773]: I0121 15:24:50.943972 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:50Z","lastTransitionTime":"2026-01-21T15:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.045757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.046005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.046142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.046236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.046298 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.149191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.149669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.149783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.149877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.149947 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.252048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.252114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.252125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.252141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.252336 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.334850 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:12:58.078059448 +0000 UTC Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.354796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.354855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.354866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.354884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.354896 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.383317 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.383458 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:51 crc kubenswrapper[4773]: E0121 15:24:51.383589 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:51 crc kubenswrapper[4773]: E0121 15:24:51.383751 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.456958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.457013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.457024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.457042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.457055 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.559800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.559846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.559859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.559877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.559890 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.662524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.662565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.662575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.662590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.662599 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.766280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.766334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.766347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.766368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.766380 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.868168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.868234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.868258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.868285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.868302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.971259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.971295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.971305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.971321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:51 crc kubenswrapper[4773]: I0121 15:24:51.971332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:51Z","lastTransitionTime":"2026-01-21T15:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.073570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.073625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.073638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.073659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.073671 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.176380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.176426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.176436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.176458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.176468 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.278876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.278923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.278932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.278948 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.278957 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.335942 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:06:57.488938798 +0000 UTC Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.381930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.381975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.381987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.382003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.382012 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.383148 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.383148 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:52 crc kubenswrapper[4773]: E0121 15:24:52.383258 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:52 crc kubenswrapper[4773]: E0121 15:24:52.383386 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.484724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.484782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.484797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.484819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.484837 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.587558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.587644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.587664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.587684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.587717 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.689924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.689979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.689990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.690008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.690021 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.792213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.792252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.792260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.792277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.792287 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.894181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.894226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.894237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.894257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.894269 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.996968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.997011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.997020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.997037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:52 crc kubenswrapper[4773]: I0121 15:24:52.997045 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:52Z","lastTransitionTime":"2026-01-21T15:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.099788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.099845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.099860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.099878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.099889 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.201716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.201749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.201758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.201771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.201780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.303708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.303751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.303761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.303776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.303785 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.336062 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 04:19:19.742859862 +0000 UTC Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.383530 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.383530 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:53 crc kubenswrapper[4773]: E0121 15:24:53.383715 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:53 crc kubenswrapper[4773]: E0121 15:24:53.383762 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.405485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.405527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.405539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.405557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.405569 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.508333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.508376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.508384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.508400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.508411 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.610743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.610797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.610811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.610832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.610846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.712853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.712896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.712906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.712921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.712932 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.815838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.815883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.815894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.815910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.815920 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.918508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.918557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.918567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.918584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:53 crc kubenswrapper[4773]: I0121 15:24:53.918596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:53Z","lastTransitionTime":"2026-01-21T15:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.020482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.020539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.020549 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.020562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.020572 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.123329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.123374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.123394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.123411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.123423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.225226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.225269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.225281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.225299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.225314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.327524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.327590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.327602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.327619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.327628 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.336882 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:01:44.132579982 +0000 UTC Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.385394 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.385419 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:54 crc kubenswrapper[4773]: E0121 15:24:54.385554 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:54 crc kubenswrapper[4773]: E0121 15:24:54.385659 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.430852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.430901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.430911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.430931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.430943 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.533764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.533829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.533840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.533857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.533868 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.636142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.636200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.636210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.636228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.636241 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.739239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.739287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.739295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.739312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.739323 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.841855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.841908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.841920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.841941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.841954 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.945332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.945395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.945408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.945427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:54 crc kubenswrapper[4773]: I0121 15:24:54.945445 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:54Z","lastTransitionTime":"2026-01-21T15:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.048280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.048330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.048342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.048380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.048391 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.150111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.150154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.150164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.150178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.150189 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.252321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.252364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.252373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.252388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.252397 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.337857 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:24:27.928431429 +0000 UTC Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.355246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.355552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.355633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.355782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.355861 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.383650 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.383762 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:55 crc kubenswrapper[4773]: E0121 15:24:55.383831 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:55 crc kubenswrapper[4773]: E0121 15:24:55.383934 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.397240 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.408826 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc 
kubenswrapper[4773]: I0121 15:24:55.440082 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.457442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.457508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.457520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.457562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.457577 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.461302 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbc
ef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.478829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.489641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.499610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.507773 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.517406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33
cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.528039 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4a
fedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.538004 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.549869 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.560523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.560560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.560571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 
15:24:55.560588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.560600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.563405 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff124
23535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.579526 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide 
LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.589246 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.600904 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.612429 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.622321 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:55Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.663041 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.663088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.663104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc 
kubenswrapper[4773]: I0121 15:24:55.663125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.663137 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.766421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.766470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.766479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.766499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.766509 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.868521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.868570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.868587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.868606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.868619 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.971349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.971391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.971399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.971412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:55 crc kubenswrapper[4773]: I0121 15:24:55.971421 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:55Z","lastTransitionTime":"2026-01-21T15:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.074264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.074300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.074309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.074325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.074338 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.176720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.176769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.176781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.176799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.176812 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.279359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.279397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.279407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.279423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.279434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.338978 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 21:43:44.660642222 +0000 UTC Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382344 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382642 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.382726 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:56 crc kubenswrapper[4773]: E0121 15:24:56.382781 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:56 crc kubenswrapper[4773]: E0121 15:24:56.382878 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.485337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.485391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.485403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.485430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.485446 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.588252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.588317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.588327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.588362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.588378 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.690447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.690490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.690499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.690513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.690524 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.793099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.793155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.793164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.793181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.793191 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.895368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.895419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.895431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.895446 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:56 crc kubenswrapper[4773]: I0121 15:24:56.895455 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:56Z","lastTransitionTime":"2026-01-21T15:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.000645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.000724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.000748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.000768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.000783 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.078372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.078421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.078430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.078447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.078459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.089424 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.093316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.093362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.093377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.093399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.093414 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.108082 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.112860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.112922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.112937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.112958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.112973 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.126262 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.129753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.129787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.129798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.129814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.129823 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.142906 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.146214 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.146263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.146275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.146293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.146308 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.156573 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:24:57Z is after 2025-08-24T17:21:41Z" Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.156679 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.158108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.158150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.158161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.158180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.158193 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.260683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.260749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.260762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.260778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.260790 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.340025 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 12:24:40.182022323 +0000 UTC Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.363684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.363755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.363768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.363809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.363822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.383129 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.383199 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.383281 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:57 crc kubenswrapper[4773]: E0121 15:24:57.383427 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.466200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.466247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.466261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.466277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.466286 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.568378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.568413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.568421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.568435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.568446 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.670243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.670312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.670328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.670345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.670356 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.773280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.773334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.773346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.773367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.773380 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.875349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.875408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.875424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.875450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.875467 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.978477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.978528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.978537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.978553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:57 crc kubenswrapper[4773]: I0121 15:24:57.978565 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:57Z","lastTransitionTime":"2026-01-21T15:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.081049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.081090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.081100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.081119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.081128 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.184167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.184238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.184252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.184272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.184286 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.286677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.286740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.286757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.286779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.286789 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.340355 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 07:21:51.816366907 +0000 UTC Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.383317 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.383355 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:24:58 crc kubenswrapper[4773]: E0121 15:24:58.383505 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:24:58 crc kubenswrapper[4773]: E0121 15:24:58.383668 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.389164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.389210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.389219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.389231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.389242 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.491486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.491536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.491545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.491561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.491577 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.593900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.593943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.593954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.593970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.593984 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.699766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.699827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.699959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.700012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.700154 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.802998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.803041 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.803051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.803066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.803078 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.905251 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.905299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.905311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.905328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:58 crc kubenswrapper[4773]: I0121 15:24:58.905339 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:58Z","lastTransitionTime":"2026-01-21T15:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.007463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.007506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.007519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.007535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.007547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.110008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.110049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.110065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.110081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.110092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.211879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.211961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.211986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.212057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.212087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.314752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.314824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.314837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.314857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.314878 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.341051 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 02:06:28.456562536 +0000 UTC Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.383759 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.383828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:24:59 crc kubenswrapper[4773]: E0121 15:24:59.383936 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:24:59 crc kubenswrapper[4773]: E0121 15:24:59.384013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.417304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.417344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.417359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.417378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.417394 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.519321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.519377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.519389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.519409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.519421 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.621877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.621928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.621938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.621953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.621963 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.724672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.724774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.724786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.724805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.724818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.828007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.828072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.828086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.828106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.828119 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.930676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.930742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.930757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.930776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:24:59 crc kubenswrapper[4773]: I0121 15:24:59.930789 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:24:59Z","lastTransitionTime":"2026-01-21T15:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.033274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.033317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.033326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.033340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.033351 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.135740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.135804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.135816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.135837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.135850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.239352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.239396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.239408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.239427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.239438 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.341196 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:19:42.895719663 +0000 UTC
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.341989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.342051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.342062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.342080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.342091 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.382775 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.382834 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:25:00 crc kubenswrapper[4773]: E0121 15:25:00.382935 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:25:00 crc kubenswrapper[4773]: E0121 15:25:00.383109 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.444814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.444874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.444887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.444907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.444921 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.547962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.548022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.548041 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.548063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.548076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.650438 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.650482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.650496 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.650515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.650526 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.753379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.753420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.753429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.753447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.753459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.856155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.856215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.856229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.856249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.856266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.958773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.958905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.958923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.958943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:00 crc kubenswrapper[4773]: I0121 15:25:00.958982 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:00Z","lastTransitionTime":"2026-01-21T15:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.062529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.062592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.062606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.062627 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.062641 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.166007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.166067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.166077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.166092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.166104 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.268965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.269023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.269044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.269070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.269084 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.341737 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 07:54:38.255019618 +0000 UTC
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.371932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.371982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.371992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.372010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.372021 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.383406 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.383575 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:25:01 crc kubenswrapper[4773]: E0121 15:25:01.383903 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:25:01 crc kubenswrapper[4773]: E0121 15:25:01.383907 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.474846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.474880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.474890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.474904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.474913 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.577735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.578068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.578164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.578254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.578332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.680462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.680506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.680517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.680536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.680549 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.783550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.783611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.783654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.783678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.783724 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.886478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.886535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.886548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.886570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.886586 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.989313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.989564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.989629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.989728 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:01 crc kubenswrapper[4773]: I0121 15:25:01.989872 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:01Z","lastTransitionTime":"2026-01-21T15:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.093243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.093324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.093342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.093398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.093418 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.196412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.196458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.196473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.196490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.196503 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.299179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.299249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.299268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.299298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.299319 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.342271 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:44:13.650011175 +0000 UTC
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.383685 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:25:02 crc kubenswrapper[4773]: E0121 15:25:02.383884 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.383995 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:25:02 crc kubenswrapper[4773]: E0121 15:25:02.384186 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.402007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.402042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.402051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.402067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.402078 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.504102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.504136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.504145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.504158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.504167 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.606440 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.606484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.606496 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.606515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.606529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.708802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.708851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.708866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.708888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.708903 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.811361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.811406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.811416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.811432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.811441 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.913205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.913252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.913262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.913278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:02 crc kubenswrapper[4773]: I0121 15:25:02.913290 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:02Z","lastTransitionTime":"2026-01-21T15:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.016014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.016103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.016122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.016160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.016172 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.119117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.119182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.119200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.119243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.119261 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.221566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.221612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.221625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.221643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.221656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.324463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.324508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.324520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.324539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.324552 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.342895 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 21:29:57.998074519 +0000 UTC Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.383426 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.383553 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:03 crc kubenswrapper[4773]: E0121 15:25:03.383596 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:03 crc kubenswrapper[4773]: E0121 15:25:03.383720 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.427636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.427671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.427681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.427736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.427749 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.530416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.530472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.530484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.530500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.530510 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.632862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.632930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.632949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.632976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.632993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.735917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.735964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.735978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.735997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.736011 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.837947 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.837989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.838000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.838017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.838027 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.940742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.940790 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.940803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.940825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:03 crc kubenswrapper[4773]: I0121 15:25:03.940840 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:03Z","lastTransitionTime":"2026-01-21T15:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.043158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.043210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.043225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.043244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.043258 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.145642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.145713 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.145726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.145745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.145762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.248850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.248902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.248913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.248931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.248942 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.343776 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:20:21.573345158 +0000 UTC Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.352253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.352304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.352318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.352341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.352357 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.383197 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.383560 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:04 crc kubenswrapper[4773]: E0121 15:25:04.383747 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:04 crc kubenswrapper[4773]: E0121 15:25:04.383962 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.384243 4773 scope.go:117] "RemoveContainer" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" Jan 21 15:25:04 crc kubenswrapper[4773]: E0121 15:25:04.384518 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.455039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.455082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.455095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.455112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.455123 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.557587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.557664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.557686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.557735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.557751 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.660353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.660387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.660396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.660412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.660424 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.763634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.763741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.763763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.763789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.763807 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.866203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.866265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.866277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.866297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.866308 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.968783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.968834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.968842 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.968859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:04 crc kubenswrapper[4773]: I0121 15:25:04.968870 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:04Z","lastTransitionTime":"2026-01-21T15:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.072414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.072464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.072474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.072495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.072507 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.176361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.176456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.176479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.176506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.176524 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.279382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.279442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.279455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.279478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.279495 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.344962 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 09:11:49.096127016 +0000 UTC Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.382654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.382725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.382738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.382759 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.382771 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.383416 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.383516 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:05 crc kubenswrapper[4773]: E0121 15:25:05.383561 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:05 crc kubenswrapper[4773]: E0121 15:25:05.383712 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.398177 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.413766 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.433187 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.447417 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.463200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.485875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.485954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.485969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.485993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.486024 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.485805 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide 
LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.512647 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.526778 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.541903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.555815 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.570111 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc 
kubenswrapper[4773]: I0121 15:25:05.585960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.589818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.589856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.589868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.589889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.589903 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.599452 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.614349 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256
d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\"
:\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.632107 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.647200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.659576 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.671865 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:05Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.692780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.692829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.692841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.692862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.692876 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.796300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.796352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.796364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.796384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.796397 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.834212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:05 crc kubenswrapper[4773]: E0121 15:25:05.834361 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:25:05 crc kubenswrapper[4773]: E0121 15:25:05.834428 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs podName:1a01fed4-2691-453e-b74f-c000d5125b53 nodeName:}" failed. No retries permitted until 2026-01-21 15:25:37.834409699 +0000 UTC m=+102.758899321 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs") pod "network-metrics-daemon-8n66g" (UID: "1a01fed4-2691-453e-b74f-c000d5125b53") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.898970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.899012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.899027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.899046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:05 crc kubenswrapper[4773]: I0121 15:25:05.899061 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:05Z","lastTransitionTime":"2026-01-21T15:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.001816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.001860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.001875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.001891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.001903 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.104762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.104811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.104827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.104847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.104863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.207663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.207736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.207748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.207764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.207774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.310480 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.310530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.310542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.310564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.310577 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.345123 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:12:28.551640658 +0000 UTC Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.382799 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.382796 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:06 crc kubenswrapper[4773]: E0121 15:25:06.382926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:06 crc kubenswrapper[4773]: E0121 15:25:06.383058 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.413262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.413323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.413333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.413368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.413379 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.516527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.516580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.516597 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.516618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.516633 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.619224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.619263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.619277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.619292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.619301 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.722169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.722204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.722213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.722226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.722237 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.825137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.825191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.825207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.825229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.825242 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.927766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.927806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.927818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.927839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:06 crc kubenswrapper[4773]: I0121 15:25:06.927850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:06Z","lastTransitionTime":"2026-01-21T15:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.030261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.030304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.030327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.030347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.030358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.133733 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.133811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.133824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.133913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.133925 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.236818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.237063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.237087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.237118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.237144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.339861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.339917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.339930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.339957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.339970 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.346162 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 13:01:55.339284478 +0000 UTC Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.383777 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.383789 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.383968 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.384078 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.429487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.429540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.429553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.429593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.429604 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.443393 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.447371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.447406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.447416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.447432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.447443 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.459404 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.462156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.462199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.462210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.462221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.462229 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.473905 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.476922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.476951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.476961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.476976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.476986 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.487989 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.490648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.490686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.490721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.490739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.490750 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.502470 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6f2b8693-eb75-45a0-8d77-3f0db13277ea\\\",\\\"systemUUID\\\":\\\"b60d999f-1a4a-45e9-ae91-551ff743d8e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: E0121 15:25:07.502591 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.504067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.504105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.504116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.504133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.504146 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.606847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.606903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.606916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.606933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.606945 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.709762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.709800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.709809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.709827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.709837 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.735752 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/0.log" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.735798 4773 generic.go:334] "Generic (PLEG): container finished" podID="34d54fdd-eda0-441f-b721-0adecc20a0db" containerID="e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b" exitCode=1 Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.735827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerDied","Data":"e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.736219 4773 scope.go:117] "RemoveContainer" containerID="e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.751855 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.764661 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.777672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.788569 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.799228 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.808155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.812395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.812419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.812427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.812441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.812450 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.818766 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.828762 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.840409 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.853053 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.864036 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.877434 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.895826 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide 
LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.914295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.914580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.914672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.914806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.914898 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:07Z","lastTransitionTime":"2026-01-21T15:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.917181 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.928766 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.943899 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.956991 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:25:06Z\\\",\\\"message\\\":\\\"2026-01-21T15:24:21+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8a5b5b3-d0c7-4bd5-91f4-f40968851357\\\\n2026-01-21T15:24:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8a5b5b3-d0c7-4bd5-91f4-f40968851357 to /host/opt/cni/bin/\\\\n2026-01-21T15:24:21Z [verbose] multus-daemon started\\\\n2026-01-21T15:24:21Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:25:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:07 crc kubenswrapper[4773]: I0121 15:25:07.966293 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:07Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.017006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.017265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.017376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.017472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.017562 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.122442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.122622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.122719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.122802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.122877 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.225751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.225836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.225851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.225895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.225912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.328959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.329252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.329329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.329398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.329481 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.346338 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:03:15.308708323 +0000 UTC Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.383720 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:08 crc kubenswrapper[4773]: E0121 15:25:08.384200 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.383685 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:08 crc kubenswrapper[4773]: E0121 15:25:08.384476 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.432630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.432686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.432732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.432757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.432774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.535608 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.535653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.535662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.535684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.535713 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.638027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.638073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.638083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.638099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.638111 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.741427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.741482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.741500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.741524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.741542 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.744221 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/0.log" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.744300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerStarted","Data":"85809c36839dac071be64acdad8a32525bf32b4586611e5ff9424305ca3f8e9b"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.762277 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gc5wj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d54fdd-eda0-441f-b721-0adecc20a0db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85809c36839dac071be64acdad8a32525bf32b4586611e5ff9424305ca3f8e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:25:06Z\\\",\\\"message\\\":\\\"2026-01-21T15:24:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8a5b5b3-d0c7-4bd5-91f4-f40968851357\\\\n2026-01-21T15:24:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8a5b5b3-d0c7-4bd5-91f4-f40968851357 to /host/opt/cni/bin/\\\\n2026-01-21T15:24:21Z [verbose] multus-daemon started\\\\n2026-01-21T15:24:21Z [verbose] Readiness Indicator file check\\\\n2026-01-21T15:25:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c298s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gc5wj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.777438 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8n66g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a01fed4-2691-453e-b74f-c000d5125b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms2mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8n66g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc 
kubenswrapper[4773]: I0121 15:25:08.804267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.816397 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.828182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.840639 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7ee75869e03f6737eaa7085e11f00ebdbba1ff8053fe155284733d92a93ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.843616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.843646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.843657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.843723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.843766 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.851304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fc6a067f75e3fa47a0005f8a7deae24bf6b5ae940f930c03d5d4204d88a1e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.860744 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t2rrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16f53932-e395-43b0-a347-69ada1fe11a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4fa8c675c42aa114ed8932508698fa47edb12a50bc4c4af25c4ff3e20fda816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nk7s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t2rrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.870516 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84731e4-6a51-4d0f-9a8b-3cfcf8b4738b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f1647f71db8dd0292eb7755678807d0d3591e5d7695da5bd8b2f7c13cf95f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abbd4fa221e8904f030aa869439d3e3ecf33
cb980eaa1252993f538f6889db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6fgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnwxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.883518 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef2ac3-d17d-45d6-8cba-f0ab6da6b120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T15:24:13Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0121 15:24:07.637962 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 15:24:07.639123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-668274627/tls.crt::/tmp/serving-cert-668274627/tls.key\\\\\\\"\\\\nI0121 15:24:13.063168 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 15:24:13.065631 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 15:24:13.065654 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 15:24:13.065672 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 15:24:13.065679 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 15:24:13.070378 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0121 15:24:13.070401 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0121 15:24:13.070407 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070412 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 15:24:13.070417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 15:24:13.070422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 15:24:13.070425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 15:24:13.070429 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0121 15:24:13.071798 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddcae9f4f599585e4ade81f655526a4a
fedf1b2187914d0f29a5794fe4e99bfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.895955 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d081e8bb-02cd-40ed-b717-f553de5feb44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f89402de6fa2a6f00952bcadc69db7a53cf3b1602eaab4e79efd49774b47cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4a2d00d1626dfc0448eb51f18c0642eafc4c40fd19116201dea004879160d4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b18a452070429a986abe3a1d6adbbaad6341ca377d5db2081430a7023d9a0df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.906354 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.923038 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6f67j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c54fc642e6a1dc584c96a204db9f8b10219f7d38f250ac59ca52ef34cc7e7181\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f58cd1701b6b1c944f17ad806905193f0a26feec77c0696ceef3b9d302a6c567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d76c47557ba3d03ab33310a70ea57c452e80b0b278d495585474ed7b57ee6bef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8148e037487d0970acfdca08b52de72c9926058f860d9215ce1921202bbb88dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb44
e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb44e6b9a2f21f9d8f147804d1487b7438472caf0ff62ff7f8433a03bcec414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a878970f42227c10f7aad40a287e54a12de554918d6c831fadfdae1c1f98b346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d3878109e58aea63423d700ff12423535c46ce61099663ad47c4ad56f25689f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g25hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6f67j\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.940821 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d23a5a4-6787-45a5-9664-20318156f46f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T15:24:48Z\\\",\\\"message\\\":\\\"rrent time 2026-01-21T15:24:48Z is after 2025-08-24T17:21:41Z]\\\\nI0121 15:24:48.227871 6438 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf\\\\nI0121 15:24:48.229971 6438 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide 
LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 15:24:48.229998 6438 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-94hkt_openshift-ovn-kubernetes(2d23a5a4-6787-45a5-9664-20318156f46f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781e968d6d9ec260a6
a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9drtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-94hkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.946626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.946677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.946688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.946722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.946734 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:08Z","lastTransitionTime":"2026-01-21T15:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.953993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mnvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb8f1575-e899-4d82-ad55-696e10474bf8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://defc4ef9c03d80fd1273053057a1f2b373935f558ba991587c0f4f1f2c78c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2hdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mnvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.967992 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.981408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b76f3602f9bc24b24ad59fc62c87012f0acd3f7cb54da9948817e004f8e186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d37ada8bfa2c6ff5a8232c953c0a46450cfc21cb4745d54928bcc53175e7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:08 crc kubenswrapper[4773]: I0121 15:25:08.993964 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dff586d5-9d98-4ec2-afb1-e550fd4f3678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58cd0b141d18d52d8b0d62730ecc25d0422b81dfcc43b7a65b8392d0280e9312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0def
f3f3f1c12cc4ca2674b53e0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:24:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srxc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:24:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rfzvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:08Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.048881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.048934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.048946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:09 crc 
kubenswrapper[4773]: I0121 15:25:09.048966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.048978 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.151557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.151591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.151601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.151617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.151627 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.254380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.254428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.254439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.254458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.254470 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.346923 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 19:19:03.22174593 +0000 UTC
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.356791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.356824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.356834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.356848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.356862 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.383329 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.383349 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:25:09 crc kubenswrapper[4773]: E0121 15:25:09.383475 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:25:09 crc kubenswrapper[4773]: E0121 15:25:09.383522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.460244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.460305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.460324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.460348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.460366 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.563216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.563263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.563275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.563296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.563307 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.666597 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.666656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.666674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.666730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.666747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.769568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.769604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.769617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.769634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.769645 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.871516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.871575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.871588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.871607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.871620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.973946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.973990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.973999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.974015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:09 crc kubenswrapper[4773]: I0121 15:25:09.974025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:09Z","lastTransitionTime":"2026-01-21T15:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.076090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.076142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.076155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.076175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.076190 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.178854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.178949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.178968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.178999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.179016 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.281779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.281823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.281835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.281854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.281869 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.347616 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 11:32:00.000601082 +0000 UTC
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.382972 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.383091 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:25:10 crc kubenswrapper[4773]: E0121 15:25:10.383302 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53"
Jan 21 15:25:10 crc kubenswrapper[4773]: E0121 15:25:10.383373 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.384447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.384517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.384543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.384575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.384602 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.486900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.486955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.486966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.486990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.487004 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.589665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.589729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.589741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.589762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.589772 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.692598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.692658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.692670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.692704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.692716 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.794971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.795028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.795044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.795064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.795078 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.897624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.897660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.897670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.897687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:10 crc kubenswrapper[4773]: I0121 15:25:10.897719 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:10Z","lastTransitionTime":"2026-01-21T15:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.000591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.000647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.000663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.000687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.000748 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.103167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.103243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.103266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.103296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.103321 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.206094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.206158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.206174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.206200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.206218 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.308077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.308174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.308187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.308206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.308217 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.348586 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:04:08.422995114 +0000 UTC
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.383490 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:25:11 crc kubenswrapper[4773]: E0121 15:25:11.383829 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.383885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:25:11 crc kubenswrapper[4773]: E0121 15:25:11.384031 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.411434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.411515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.411535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.411562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.411581 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.514516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.514578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.514592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.514612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.514625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.617539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.617622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.617646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.617677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.617734 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.720666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.720743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.720759 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.720777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.720789 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.823991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.824038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.824050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.824066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.824080 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.928568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.929024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.929121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.929222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:11 crc kubenswrapper[4773]: I0121 15:25:11.929307 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:11Z","lastTransitionTime":"2026-01-21T15:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.032253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.032294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.032306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.032325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.032338 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.137370 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.137911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.137950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.137988 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.138005 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.239860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.239922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.239932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.239946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.239955 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.342644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.342687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.342740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.342767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.342782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.349264 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:52:06.059905948 +0000 UTC Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.383839 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.383839 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:12 crc kubenswrapper[4773]: E0121 15:25:12.384324 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:12 crc kubenswrapper[4773]: E0121 15:25:12.384510 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.446014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.446089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.446127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.446159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.446182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.549752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.549832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.549869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.549902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.549924 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.653076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.653132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.653142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.653160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.653172 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.756159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.756199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.756210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.756226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.756241 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.860512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.860567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.860578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.860597 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.860612 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.963565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.963628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.963644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.963669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:12 crc kubenswrapper[4773]: I0121 15:25:12.963688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:12Z","lastTransitionTime":"2026-01-21T15:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.066967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.067014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.067023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.067041 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.067051 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.169415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.169492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.169509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.169535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.169552 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.272600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.272679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.272743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.272781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.272806 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.349761 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:25:50.281930415 +0000 UTC Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.376266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.376816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.376825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.376840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.376850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.382912 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:13 crc kubenswrapper[4773]: E0121 15:25:13.383016 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.383201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:13 crc kubenswrapper[4773]: E0121 15:25:13.383283 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.479587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.479625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.479635 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.479651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.479662 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.582263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.582337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.582363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.582405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.582432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.685282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.685421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.685434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.685453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.685464 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.788859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.788924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.788935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.788956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.788967 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.893347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.893407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.893423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.893447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.893463 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.996653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.996746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.996763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.996786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:13 crc kubenswrapper[4773]: I0121 15:25:13.996802 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:13Z","lastTransitionTime":"2026-01-21T15:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.099837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.099892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.099902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.099923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.099935 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.202975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.203040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.203054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.203106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.203123 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.307249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.307285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.307296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.307312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.307324 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.350896 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 23:33:42.76140084 +0000 UTC Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.383271 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.383384 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:14 crc kubenswrapper[4773]: E0121 15:25:14.383458 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:14 crc kubenswrapper[4773]: E0121 15:25:14.383611 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.410284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.410369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.410390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.410418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.410444 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.513738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.513797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.513812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.513831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.513844 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.615865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.615930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.615945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.615964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.615976 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.718580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.718638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.718651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.718671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.718689 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.821447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.821968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.822102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.822254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.822394 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.926837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.926905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.926920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.926943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:14 crc kubenswrapper[4773]: I0121 15:25:14.926959 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:14Z","lastTransitionTime":"2026-01-21T15:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.029593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.029633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.029643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.029659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.029669 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.133490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.133574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.133594 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.133620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.133638 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.237524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.237613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.237636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.237670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.237728 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.341277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.341342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.341378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.341420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.341444 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.351835 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 11:28:36.1208253 +0000 UTC Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.383617 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:15 crc kubenswrapper[4773]: E0121 15:25:15.383912 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.383617 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:15 crc kubenswrapper[4773]: E0121 15:25:15.384170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.408669 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c49f8543-ced3-43e4-a130-05f586b8bfea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://835e012e83dbfe56e1f0e6770d354ad217c8377f7714dc8159dd74cf56fd8247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3950a560ab23173e8a8d3eaf4eb54a56370391768193885d6cc96d12aca2e602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fbe0916d8b2925c52f34b379cddfbe931626c72a6746953fd9ca10e61111b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7aa70929abee16beceb67f58502d800b62ae5a85b55cf297ee407c2b0087da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c9bfeec07e90e2bbe71afce27a401fd7f447504c99f44d8c7d98e0703a7d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e78e3fd8e3f87391b301b09dca048afe1b306fa69b345663ebc2c78d448439c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee8a0fea475b8efe8923a72179dce5308d238471432b9cf43925556b0aca2276\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25f020ace7272892dbcc8961c4531abdb0943649a8c3a671b96ccf26ca844403\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.422331 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc7919e-a9f2-4e74-84b2-d45532f88119\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T15:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7177d316f870571dc5481c145d33cb7045938fe0537f51543f83991cdc44ebd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a91e99071784df2b01fef79a6d8bbcef3db7b0f853f6523009acfc83c23e2c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c0a968ae28c1f228af6e4ee3cd2eed0a13b2398efe4cbab048c6e973746924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65587a782c152b64ede27c3effebef6966a53db27191bd9e9b6cc09c997b585c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T15:23:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T15:23:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T15:23:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.433744 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T15:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T15:25:15Z is after 2025-08-24T17:21:41Z" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.444744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.444781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.444793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.444812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.444825 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.470294 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gc5wj" podStartSLOduration=56.470272369 podStartE2EDuration="56.470272369s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.453862651 +0000 UTC m=+80.378352283" watchObservedRunningTime="2026-01-21 15:25:15.470272369 +0000 UTC m=+80.394761991" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.531143 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=62.531116344 podStartE2EDuration="1m2.531116344s" podCreationTimestamp="2026-01-21 15:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.517329375 +0000 UTC m=+80.441818997" watchObservedRunningTime="2026-01-21 15:25:15.531116344 +0000 UTC m=+80.455605966" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.548025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.548078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.548089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.548111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 
15:25:15.548127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.548571 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=59.548541189 podStartE2EDuration="59.548541189s" podCreationTimestamp="2026-01-21 15:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.530968741 +0000 UTC m=+80.455458373" watchObservedRunningTime="2026-01-21 15:25:15.548541189 +0000 UTC m=+80.473030821" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.559779 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t2rrh" podStartSLOduration=56.559759542 podStartE2EDuration="56.559759542s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.5596773 +0000 UTC m=+80.484166922" watchObservedRunningTime="2026-01-21 15:25:15.559759542 +0000 UTC m=+80.484249184" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.591629 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnwxf" podStartSLOduration=55.591603632 podStartE2EDuration="55.591603632s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.577101213 +0000 UTC m=+80.501590845" watchObservedRunningTime="2026-01-21 15:25:15.591603632 +0000 UTC m=+80.516093254" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.625032 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podStartSLOduration=56.624999222 podStartE2EDuration="56.624999222s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.619052437 +0000 UTC m=+80.543542069" watchObservedRunningTime="2026-01-21 15:25:15.624999222 +0000 UTC m=+80.549488844" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.653308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.653338 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.653346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.653362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.653372 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.680125 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6f67j" podStartSLOduration=56.680098029 podStartE2EDuration="56.680098029s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.650095366 +0000 UTC m=+80.574584988" watchObservedRunningTime="2026-01-21 15:25:15.680098029 +0000 UTC m=+80.604587651" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.691085 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mnvdf" podStartSLOduration=56.691047424 podStartE2EDuration="56.691047424s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:15.691038944 +0000 UTC m=+80.615528566" watchObservedRunningTime="2026-01-21 15:25:15.691047424 +0000 UTC m=+80.615537066" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.755490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.755560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.755574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.755602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.755623 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.858970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.859024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.859035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.859053 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.859063 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.962614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.962671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.962722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.962751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:15 crc kubenswrapper[4773]: I0121 15:25:15.962772 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:15Z","lastTransitionTime":"2026-01-21T15:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.066729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.066806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.066823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.066854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.066874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.170279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.170321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.170334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.170353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.170366 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.273389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.273452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.273464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.273486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.273502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.352745 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 06:53:16.955698244 +0000 UTC Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.376114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.376182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.376196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.376216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.376230 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.382862 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.382880 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:16 crc kubenswrapper[4773]: E0121 15:25:16.383044 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:16 crc kubenswrapper[4773]: E0121 15:25:16.383389 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.383794 4773 scope.go:117] "RemoveContainer" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.479173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.479663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.479678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.479722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.479739 4773 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.582671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.582783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.582801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.582827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.582846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.685482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.685531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.685547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.685567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.685584 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.788621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.788684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.788731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.788757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.788776 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.892262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.892313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.892325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.892343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.892356 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.995887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.995926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.995936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.995953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:16 crc kubenswrapper[4773]: I0121 15:25:16.995963 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:16Z","lastTransitionTime":"2026-01-21T15:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.098230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.098278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.098292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.098312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.098325 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.203615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.204297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.204321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.204345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.204359 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.261554 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.261819 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.262072 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:26:21.262048058 +0000 UTC m=+146.186537680 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.262070 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.262128 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.262149 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.262233 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:21.262209612 +0000 UTC m=+146.186699254 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.308203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.308251 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.308262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.308281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.308296 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.353635 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 20:21:18.952439893 +0000 UTC Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.363184 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.363284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.363309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363327 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363431 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:21.36340734 +0000 UTC m=+146.287896962 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363471 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363490 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363504 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363569 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:21.363552373 +0000 UTC m=+146.288041995 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363603 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.363726 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 15:26:21.363687277 +0000 UTC m=+146.288176899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.383128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.383192 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.383274 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.383414 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.410460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.410905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.411034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.411143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.411319 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.514010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.514321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.514393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.514456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.514512 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.616506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.616540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.616548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.616561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.616570 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.719523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.719561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.719571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.719586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.719596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.746274 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8n66g"] Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.746639 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:17 crc kubenswrapper[4773]: E0121 15:25:17.746834 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.780902 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/2.log" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.783433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerStarted","Data":"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.783897 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.809434 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.809412197 podStartE2EDuration="1m4.809412197s" podCreationTimestamp="2026-01-21 15:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:17.806911851 +0000 UTC m=+82.731401473" watchObservedRunningTime="2026-01-21 15:25:17.809412197 +0000 UTC m=+82.733901819" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.819951 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=27.81992788 podStartE2EDuration="27.81992788s" podCreationTimestamp="2026-01-21 15:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:17.818541684 +0000 UTC m=+82.743031306" watchObservedRunningTime="2026-01-21 15:25:17.81992788 +0000 UTC m=+82.744417502" Jan 21 15:25:17 crc 
kubenswrapper[4773]: I0121 15:25:17.821375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.821411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.821421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.821437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.821447 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.855123 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podStartSLOduration=57.855097007 podStartE2EDuration="57.855097007s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:17.854991324 +0000 UTC m=+82.779481026" watchObservedRunningTime="2026-01-21 15:25:17.855097007 +0000 UTC m=+82.779586629" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.889666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.889784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.889809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.889855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.889875 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T15:25:17Z","lastTransitionTime":"2026-01-21T15:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.960988 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx"] Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.961364 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.963519 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.963675 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.964026 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 15:25:17 crc kubenswrapper[4773]: I0121 15:25:17.965285 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.071396 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.071460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2040034-79e5-4e7a-bf33-03aab7184434-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: 
\"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.071490 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2040034-79e5-4e7a-bf33-03aab7184434-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.071536 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2040034-79e5-4e7a-bf33-03aab7184434-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.071559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172319 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2040034-79e5-4e7a-bf33-03aab7184434-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172367 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2040034-79e5-4e7a-bf33-03aab7184434-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2040034-79e5-4e7a-bf33-03aab7184434-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: 
\"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.172584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d2040034-79e5-4e7a-bf33-03aab7184434-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.173460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2040034-79e5-4e7a-bf33-03aab7184434-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.177799 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2040034-79e5-4e7a-bf33-03aab7184434-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.189793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2040034-79e5-4e7a-bf33-03aab7184434-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vkwgx\" (UID: \"d2040034-79e5-4e7a-bf33-03aab7184434\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.273109 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" Jan 21 15:25:18 crc kubenswrapper[4773]: W0121 15:25:18.291821 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2040034_79e5_4e7a_bf33_03aab7184434.slice/crio-d108818224ae7dc7c48af0a4d376d167874240ce05b06080e62b563a39c1815a WatchSource:0}: Error finding container d108818224ae7dc7c48af0a4d376d167874240ce05b06080e62b563a39c1815a: Status 404 returned error can't find the container with id d108818224ae7dc7c48af0a4d376d167874240ce05b06080e62b563a39c1815a Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.356808 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 02:57:00.864157464 +0000 UTC Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.356907 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.365266 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.383058 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:18 crc kubenswrapper[4773]: E0121 15:25:18.383184 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:18 crc kubenswrapper[4773]: I0121 15:25:18.787062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" event={"ID":"d2040034-79e5-4e7a-bf33-03aab7184434","Type":"ContainerStarted","Data":"d108818224ae7dc7c48af0a4d376d167874240ce05b06080e62b563a39c1815a"} Jan 21 15:25:19 crc kubenswrapper[4773]: I0121 15:25:19.383059 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:19 crc kubenswrapper[4773]: I0121 15:25:19.383059 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:19 crc kubenswrapper[4773]: E0121 15:25:19.383783 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 15:25:19 crc kubenswrapper[4773]: I0121 15:25:19.383101 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:19 crc kubenswrapper[4773]: E0121 15:25:19.384218 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8n66g" podUID="1a01fed4-2691-453e-b74f-c000d5125b53" Jan 21 15:25:19 crc kubenswrapper[4773]: E0121 15:25:19.383994 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 15:25:19 crc kubenswrapper[4773]: I0121 15:25:19.791323 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" event={"ID":"d2040034-79e5-4e7a-bf33-03aab7184434","Type":"ContainerStarted","Data":"cd3ed3170e0100497892b99e5126ba7f24927a06479a3edee36cf98d4f31e2ae"} Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.382723 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:20 crc kubenswrapper[4773]: E0121 15:25:20.382905 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.805108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.805402 4773 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.850573 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vkwgx" podStartSLOduration=61.850534394 podStartE2EDuration="1m1.850534394s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:19.813199722 +0000 UTC m=+84.737689344" watchObservedRunningTime="2026-01-21 15:25:20.850534394 +0000 UTC m=+85.775024016" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.851815 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6c6p"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.852627 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.853368 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.854201 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.856669 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.864945 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.865388 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.865380 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.866006 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.865452 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.867100 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.867547 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.867568 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.867772 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.868225 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lt258"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.868756 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.870075 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvt6x"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.870489 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.871745 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d6phv"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.872151 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.872496 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.873166 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.875164 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.877979 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-w5ls2"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.878311 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.878560 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.878872 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.879173 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.879342 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.880120 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.880358 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.880361 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.884775 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.885510 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.885874 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.886338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.889228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.889573 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.889766 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.894742 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.895082 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.895340 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.895603 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.895776 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.896530 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jwjhn"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.896713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.897051 
4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.897120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.897202 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vf566"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.897530 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.897814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.898428 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.898725 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.898780 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.898965 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899086 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899250 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899623 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899648 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899662 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.899810 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.902338 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.902505 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.902771 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 
21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.903150 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.903314 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.903461 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.903638 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.904085 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.904192 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.904319 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.904979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.905123 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.905246 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.905502 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.905663 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.906009 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.906302 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.906446 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.918133 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.921833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-config\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.921887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb30bae7-1663-4658-acf9-f76adf8d12ea-serving-cert\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.921952 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.921976 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/569e4aab-6b67-4448-9e6e-ecab14ebc87e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922047 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb30bae7-1663-4658-acf9-f76adf8d12ea-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: 
\"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922088 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922107 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-images\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcr8p\" (UniqueName: \"kubernetes.io/projected/bb30bae7-1663-4658-acf9-f76adf8d12ea-kube-api-access-bcr8p\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqbgs\" (UniqueName: \"kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.922188 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndbm\" (UniqueName: \"kubernetes.io/projected/569e4aab-6b67-4448-9e6e-ecab14ebc87e-kube-api-access-wndbm\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.923237 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.923571 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.923752 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.923997 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.924155 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.924335 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.924486 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.924666 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.924863 4773 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.925152 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.925617 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.926536 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.928074 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.928597 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.928655 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.929463 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f5jgd"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.929500 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.929773 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.929976 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 
15:25:20.930236 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.930451 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.930873 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.931016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.931290 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.931487 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.931743 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932007 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932072 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932335 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932457 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932346 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932528 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932645 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932744 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.932961 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.933139 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.933230 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.933405 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.933951 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.934515 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.936997 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.937116 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.937228 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.937637 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.937848 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.938280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.938855 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.939321 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.939527 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.939843 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.939955 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.940050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.940056 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.947286 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.954343 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.955583 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.975629 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.975929 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xrdf7"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.976704 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.976738 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.977168 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.977432 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.977774 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.977823 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.982151 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.989117 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.989330 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.991572 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.992914 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.994762 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.995228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.995438 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.996128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.996385 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg"] Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.996419 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.996906 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:20 crc kubenswrapper[4773]: I0121 15:25:20.998183 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.000329 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.005961 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmkjf"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.007266 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.007370 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.007271 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.008182 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.008325 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.009997 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.010095 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.010583 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4kntm"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.010916 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.010936 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.011015 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.012216 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h26dv"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.017545 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.018018 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.019137 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.019385 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.021453 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.023268 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.023740 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.027647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.027838 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.027872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.027949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcr8p\" (UniqueName: \"kubernetes.io/projected/bb30bae7-1663-4658-acf9-f76adf8d12ea-kube-api-access-bcr8p\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde388e0-c191-408f-a40e-72c1414c4d14-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028110 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028514 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-srv-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b347bcd3-0e23-40a4-8e27-9140db184474-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028587 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdx7h\" (UniqueName: \"kubernetes.io/projected/b347bcd3-0e23-40a4-8e27-9140db184474-kube-api-access-xdx7h\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.028888 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-client\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqbgs\" (UniqueName: \"kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029488 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wndbm\" (UniqueName: \"kubernetes.io/projected/569e4aab-6b67-4448-9e6e-ecab14ebc87e-kube-api-access-wndbm\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029523 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npn8\" (UniqueName: \"kubernetes.io/projected/fcb80de5-75af-4316-a192-3ffac092ffd9-kube-api-access-9npn8\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029561 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-node-pullsecrets\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit-dir\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-policies\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029715 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-auth-proxy-config\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029751 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8acf9922-28d9-410b-b416-6685314b9964-machine-approver-tls\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029797 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-config\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc 
kubenswrapper[4773]: I0121 15:25:21.029826 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxth\" (UniqueName: \"kubernetes.io/projected/75a6e760-8173-4942-a194-297cce124b98-kube-api-access-chxth\") pod \"downloads-7954f5f757-w5ls2\" (UID: \"75a6e760-8173-4942-a194-297cce124b98\") " pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.029954 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030152 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-encryption-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030222 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030265 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8242f19f-45e0-4481-9d43-19305274878b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030322 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb80de5-75af-4316-a192-3ffac092ffd9-serving-cert\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-encryption-config\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030512 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpdz\" (UniqueName: \"kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030589 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-config\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde388e0-c191-408f-a40e-72c1414c4d14-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2e950a56-b252-49e1-b795-4931be982e88-tmpfs\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030804 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-apiservice-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030841 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-config\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030868 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb30bae7-1663-4658-acf9-f76adf8d12ea-serving-cert\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.030988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-serving-cert\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031028 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-serving-cert\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031096 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xms9\" (UniqueName: \"kubernetes.io/projected/8acf9922-28d9-410b-b416-6685314b9964-kube-api-access-4xms9\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031124 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-image-import-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031177 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031218 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh86f\" (UniqueName: \"kubernetes.io/projected/eded1f09-fe44-4693-939c-60335f2d6b22-kube-api-access-zh86f\") pod \"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69285d7a-0343-4bd8-a6e5-750bf8051c3e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-config\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-trusted-ca\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.031556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp89s\" (UniqueName: \"kubernetes.io/projected/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-kube-api-access-gp89s\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-dir\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032336 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52q5n\" (UniqueName: \"kubernetes.io/projected/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-kube-api-access-52q5n\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032365 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rxp67\" (UniqueName: \"kubernetes.io/projected/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-kube-api-access-rxp67\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032397 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6lp\" (UniqueName: \"kubernetes.io/projected/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-kube-api-access-vd6lp\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032435 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032485 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69285d7a-0343-4bd8-a6e5-750bf8051c3e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: 
\"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smj6z\" (UniqueName: \"kubernetes.io/projected/dde388e0-c191-408f-a40e-72c1414c4d14-kube-api-access-smj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032621 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-webhook-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032709 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-serving-cert\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032740 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032770 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4j6k\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-kube-api-access-k4j6k\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8242f19f-45e0-4481-9d43-19305274878b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032368 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.032967 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033007 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n52q\" (UniqueName: \"kubernetes.io/projected/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-kube-api-access-6n52q\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033031 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-srv-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033130 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-serving-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033206 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033318 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45l6z\" (UniqueName: \"kubernetes.io/projected/8a92707c-0d6f-4561-bcda-8997f3c2967d-kube-api-access-45l6z\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033382 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-client\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxcg\" (UniqueName: \"kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.033524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-config\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034193 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034225 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/569e4aab-6b67-4448-9e6e-ecab14ebc87e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034362 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9th\" (UniqueName: \"kubernetes.io/projected/2e950a56-b252-49e1-b795-4931be982e88-kube-api-access-jz9th\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034396 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.034984 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb30bae7-1663-4658-acf9-f76adf8d12ea-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" 
Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.035031 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eded1f09-fe44-4693-939c-60335f2d6b22-metrics-tls\") pod \"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.035096 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.035167 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.036444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5ht\" (UniqueName: \"kubernetes.io/projected/8242f19f-45e0-4481-9d43-19305274878b-kube-api-access-sl5ht\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.036552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.036719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-images\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.036865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.036906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.037215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.037940 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb30bae7-1663-4658-acf9-f76adf8d12ea-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.038079 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv648\" (UniqueName: \"kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.038812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.039171 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/569e4aab-6b67-4448-9e6e-ecab14ebc87e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.039341 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert\") pod \"controller-manager-879f6c89f-89m48\" (UID: 
\"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.040497 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6c6p"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.044402 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.047437 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.050755 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d6phv"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.051169 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.054886 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.056162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/569e4aab-6b67-4448-9e6e-ecab14ebc87e-images\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.057540 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sc9wz"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.058124 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 
15:25:21.060771 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-72zzh"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.061200 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.062606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb30bae7-1663-4658-acf9-f76adf8d12ea-serving-cert\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.064009 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.064239 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.067525 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jwjhn"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.070586 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vf566"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.070656 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.074144 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.074206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lt258"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.074221 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.075798 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.077352 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.078435 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.079662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4"] Jan 21 15:25:21 
crc kubenswrapper[4773]: I0121 15:25:21.080663 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.082009 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvt6x"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.083055 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h26dv"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.084457 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.085870 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.086961 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.089000 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dqkqd"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.089993 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.090558 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.091744 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2lnn7"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.092313 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.092764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.094153 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.095772 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.096055 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dqkqd"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.097377 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f5jgd"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.098468 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.099563 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-w5ls2"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.101086 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.105892 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.107004 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.108450 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4kntm"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.110282 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.111846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmkjf"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.113838 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.115468 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sc9wz"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.115470 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.117022 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:25:21 crc 
kubenswrapper[4773]: I0121 15:25:21.118524 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-72zzh"] Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.135864 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139000 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-image-import-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139115 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69285d7a-0343-4bd8-a6e5-750bf8051c3e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-config\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-trusted-ca\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh86f\" (UniqueName: \"kubernetes.io/projected/eded1f09-fe44-4693-939c-60335f2d6b22-kube-api-access-zh86f\") pod \"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139348 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-dir\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp89s\" (UniqueName: \"kubernetes.io/projected/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-kube-api-access-gp89s\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139457 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52q5n\" (UniqueName: \"kubernetes.io/projected/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-kube-api-access-52q5n\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139493 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxp67\" (UniqueName: \"kubernetes.io/projected/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-kube-api-access-rxp67\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6lp\" (UniqueName: \"kubernetes.io/projected/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-kube-api-access-vd6lp\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69285d7a-0343-4bd8-a6e5-750bf8051c3e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: 
\"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139656 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smj6z\" (UniqueName: \"kubernetes.io/projected/dde388e0-c191-408f-a40e-72c1414c4d14-kube-api-access-smj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139774 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-webhook-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-serving-cert\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4j6k\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-kube-api-access-k4j6k\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n52q\" (UniqueName: \"kubernetes.io/projected/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-kube-api-access-6n52q\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139941 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-srv-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.139969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8242f19f-45e0-4481-9d43-19305274878b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140000 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-serving-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45l6z\" (UniqueName: \"kubernetes.io/projected/8a92707c-0d6f-4561-bcda-8997f3c2967d-kube-api-access-45l6z\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140144 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140171 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-client\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxcg\" (UniqueName: 
\"kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140349 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-image-import-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140384 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9th\" (UniqueName: \"kubernetes.io/projected/2e950a56-b252-49e1-b795-4931be982e88-kube-api-access-jz9th\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140447 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eded1f09-fe44-4693-939c-60335f2d6b22-metrics-tls\") pod \"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lt258\" (UID: 
\"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140618 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140645 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140753 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5ht\" (UniqueName: \"kubernetes.io/projected/8242f19f-45e0-4481-9d43-19305274878b-kube-api-access-sl5ht\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140837 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv648\" (UniqueName: \"kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.140953 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde388e0-c191-408f-a40e-72c1414c4d14-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 
crc kubenswrapper[4773]: I0121 15:25:21.140979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141007 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-srv-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141077 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b347bcd3-0e23-40a4-8e27-9140db184474-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdx7h\" (UniqueName: \"kubernetes.io/projected/b347bcd3-0e23-40a4-8e27-9140db184474-kube-api-access-xdx7h\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141134 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-client\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npn8\" (UniqueName: \"kubernetes.io/projected/fcb80de5-75af-4316-a192-3ffac092ffd9-kube-api-access-9npn8\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit-dir\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141288 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-policies\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141346 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-auth-proxy-config\") pod 
\"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141369 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8acf9922-28d9-410b-b416-6685314b9964-machine-approver-tls\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-node-pullsecrets\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141408 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141438 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-config\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-encryption-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141608 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-dir\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141640 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-config\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141663 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-chxth\" (UniqueName: \"kubernetes.io/projected/75a6e760-8173-4942-a194-297cce124b98-kube-api-access-chxth\") pod \"downloads-7954f5f757-w5ls2\" (UID: \"75a6e760-8173-4942-a194-297cce124b98\") " pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141715 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8242f19f-45e0-4481-9d43-19305274878b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-encryption-config\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141772 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb80de5-75af-4316-a192-3ffac092ffd9-serving-cert\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: 
I0121 15:25:21.141796 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-config\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.141836 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde388e0-c191-408f-a40e-72c1414c4d14-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.142115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-audit-policies\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.142663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.143076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-config\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.143774 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.144031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.144086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8242f19f-45e0-4481-9d43-19305274878b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.144366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-client\") pod \"apiserver-76f77b778f-vf566\" (UID: 
\"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.144434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.144542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-audit-dir\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.145157 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-node-pullsecrets\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.145466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.145776 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " 
pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.145929 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8acf9922-28d9-410b-b416-6685314b9964-auth-proxy-config\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.146457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-encryption-config\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.146543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fcb80de5-75af-4316-a192-3ffac092ffd9-trusted-ca\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.146787 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.150437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lt258\" (UID: 
\"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.150869 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69285d7a-0343-4bd8-a6e5-750bf8051c3e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.150925 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.151080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-config\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.151960 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.152504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.152636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-etcd-client\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.152922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153004 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 
15:25:21.153357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpdz\" (UniqueName: \"kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2e950a56-b252-49e1-b795-4931be982e88-tmpfs\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-apiservice-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc 
kubenswrapper[4773]: I0121 15:25:21.153466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-serving-cert\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153485 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153506 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-serving-cert\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xms9\" (UniqueName: \"kubernetes.io/projected/8acf9922-28d9-410b-b416-6685314b9964-kube-api-access-4xms9\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.153600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " 
pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.154247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.154342 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2e950a56-b252-49e1-b795-4931be982e88-tmpfs\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.154513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-serving-cert\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.154941 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.155144 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert\") pod 
\"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.155345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.155434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-webhook-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.155784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8acf9922-28d9-410b-b416-6685314b9964-machine-approver-tls\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.156292 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-serving-cert\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.156387 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eded1f09-fe44-4693-939c-60335f2d6b22-metrics-tls\") pod 
\"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.156928 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde388e0-c191-408f-a40e-72c1414c4d14-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.157648 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-serving-cert\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.157797 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/69285d7a-0343-4bd8-a6e5-750bf8051c3e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.158219 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.158641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8242f19f-45e0-4481-9d43-19305274878b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.159119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.159430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.159450 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb80de5-75af-4316-a192-3ffac092ffd9-serving-cert\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.159873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-encryption-config\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.160249 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde388e0-c191-408f-a40e-72c1414c4d14-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.160962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-etcd-serving-ca\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.160976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.161822 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2e950a56-b252-49e1-b795-4931be982e88-apiservice-cert\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.162174 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.162403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.164428 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.164855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.166110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b347bcd3-0e23-40a4-8e27-9140db184474-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.175843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.195860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.201478 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.206579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.215815 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.236358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a92707c-0d6f-4561-bcda-8997f3c2967d-srv-cert\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.238317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.247410 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-srv-cert\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:21 crc kubenswrapper[4773]: 
I0121 15:25:21.264462 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.275216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.294993 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.315444 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.335479 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.354879 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.375226 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.382809 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.382834 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.382809 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.395127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.414815 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.434730 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.455272 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.495383 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.515827 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.534850 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.556014 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.577087 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.596108 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.616035 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.635255 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.656716 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.675251 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.695430 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.715683 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.736216 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.755369 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.775506 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.796455 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.815127 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.835954 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.856566 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.876744 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.895446 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.921715 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.934730 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.955593 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.976295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 15:25:21 crc kubenswrapper[4773]: I0121 15:25:21.996136 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.013958 4773 request.go:700] Waited for 1.002699569s due 
to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.016296 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.035523 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.056208 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.076076 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.095785 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.115606 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.135272 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.156328 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.176025 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 15:25:22 crc 
kubenswrapper[4773]: I0121 15:25:22.196174 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.217675 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.234831 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.255038 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.275126 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.295865 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.315465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.336420 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.356254 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.376106 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.383422 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.399822 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.415803 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.435409 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.455323 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.492984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcr8p\" (UniqueName: \"kubernetes.io/projected/bb30bae7-1663-4658-acf9-f76adf8d12ea-kube-api-access-bcr8p\") pod \"openshift-config-operator-7777fb866f-tnfh2\" (UID: \"bb30bae7-1663-4658-acf9-f76adf8d12ea\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.495086 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.516240 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.556085 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqbgs\" (UniqueName: 
\"kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs\") pod \"controller-manager-879f6c89f-89m48\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") " pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.576040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndbm\" (UniqueName: \"kubernetes.io/projected/569e4aab-6b67-4448-9e6e-ecab14ebc87e-kube-api-access-wndbm\") pod \"machine-api-operator-5694c8668f-t6c6p\" (UID: \"569e4aab-6b67-4448-9e6e-ecab14ebc87e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.578373 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.595996 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.625164 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.636108 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.655984 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.678858 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.680456 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.696304 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.697376 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.715801 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.734679 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.735512 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.756168 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.776132 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.801039 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.835132 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.855659 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 
15:25:22.869750 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t6c6p"] Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.880322 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.898812 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.904294 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"] Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.916015 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.934286 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.948777 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2"] Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.955339 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 15:25:22 crc kubenswrapper[4773]: W0121 15:25:22.957811 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb30bae7_1663_4658_acf9_f76adf8d12ea.slice/crio-ff3b564dfce3718c8c8bca697791b7bf23f4ec6c4f68697a6709f3c97df6a156 WatchSource:0}: Error finding container ff3b564dfce3718c8c8bca697791b7bf23f4ec6c4f68697a6709f3c97df6a156: Status 404 returned error can't find the container with id 
ff3b564dfce3718c8c8bca697791b7bf23f4ec6c4f68697a6709f3c97df6a156 Jan 21 15:25:22 crc kubenswrapper[4773]: I0121 15:25:22.991357 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9th\" (UniqueName: \"kubernetes.io/projected/2e950a56-b252-49e1-b795-4931be982e88-kube-api-access-jz9th\") pod \"packageserver-d55dfcdfc-946gq\" (UID: \"2e950a56-b252-49e1-b795-4931be982e88\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.008647 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxcg\" (UniqueName: \"kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg\") pod \"console-f9d7485db-xwn45\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.027549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp89s\" (UniqueName: \"kubernetes.io/projected/0bb536d4-f4ae-44ac-8477-0d14b97ebe04-kube-api-access-gp89s\") pod \"olm-operator-6b444d44fb-68gt2\" (UID: \"0bb536d4-f4ae-44ac-8477-0d14b97ebe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.033768 4773 request.go:700] Waited for 1.8923464s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.050951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxp67\" (UniqueName: \"kubernetes.io/projected/ddeb61c7-7fde-4331-abe7-0dc69b173ee1-kube-api-access-rxp67\") pod \"cluster-samples-operator-665b6dd947-k25lw\" (UID: \"ddeb61c7-7fde-4331-abe7-0dc69b173ee1\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.071924 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6lp\" (UniqueName: \"kubernetes.io/projected/1aaad209-017c-4f1a-af2b-7bdb507ed1a0-kube-api-access-vd6lp\") pod \"apiserver-7bbb656c7d-pzhf4\" (UID: \"1aaad209-017c-4f1a-af2b-7bdb507ed1a0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.086068 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.095446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5ht\" (UniqueName: \"kubernetes.io/projected/8242f19f-45e0-4481-9d43-19305274878b-kube-api-access-sl5ht\") pod \"openshift-apiserver-operator-796bbdcf4f-jrkhf\" (UID: \"8242f19f-45e0-4481-9d43-19305274878b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.110478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n52q\" (UniqueName: \"kubernetes.io/projected/454bfb09-e6a5-4e20-af7b-9aa8a52ca678-kube-api-access-6n52q\") pod \"apiserver-76f77b778f-vf566\" (UID: \"454bfb09-e6a5-4e20-af7b-9aa8a52ca678\") " pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.131374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:23 crc 
kubenswrapper[4773]: I0121 15:25:23.153209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.165575 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv648\" (UniqueName: \"kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648\") pod \"route-controller-manager-6576b87f9c-6x99r\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.174674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdx7h\" (UniqueName: \"kubernetes.io/projected/b347bcd3-0e23-40a4-8e27-9140db184474-kube-api-access-xdx7h\") pod \"multus-admission-controller-857f4d67dd-f5jgd\" (UID: \"b347bcd3-0e23-40a4-8e27-9140db184474\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.192372 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.192914 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52q5n\" (UniqueName: \"kubernetes.io/projected/cd6b003a-6a86-41a8-849c-c2c30fdcdbf4-kube-api-access-52q5n\") pod \"authentication-operator-69f744f599-vvt6x\" (UID: \"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.214584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npn8\" (UniqueName: \"kubernetes.io/projected/fcb80de5-75af-4316-a192-3ffac092ffd9-kube-api-access-9npn8\") pod \"console-operator-58897d9998-d6phv\" (UID: \"fcb80de5-75af-4316-a192-3ffac092ffd9\") " pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.216419 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.229491 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.235275 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.237060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh86f\" (UniqueName: \"kubernetes.io/projected/eded1f09-fe44-4693-939c-60335f2d6b22-kube-api-access-zh86f\") pod \"dns-operator-744455d44c-jwjhn\" (UID: \"eded1f09-fe44-4693-939c-60335f2d6b22\") " pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.247800 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.267343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxth\" (UniqueName: \"kubernetes.io/projected/75a6e760-8173-4942-a194-297cce124b98-kube-api-access-chxth\") pod \"downloads-7954f5f757-w5ls2\" (UID: \"75a6e760-8173-4942-a194-297cce124b98\") " pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.278235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45l6z\" (UniqueName: \"kubernetes.io/projected/8a92707c-0d6f-4561-bcda-8997f3c2967d-kube-api-access-45l6z\") pod \"catalog-operator-68c6474976-6zgnb\" (UID: \"8a92707c-0d6f-4561-bcda-8997f3c2967d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.281385 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.312379 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.315436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smj6z\" (UniqueName: \"kubernetes.io/projected/dde388e0-c191-408f-a40e-72c1414c4d14-kube-api-access-smj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-fvd2g\" (UID: \"dde388e0-c191-408f-a40e-72c1414c4d14\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.315945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4j6k\" (UniqueName: \"kubernetes.io/projected/69285d7a-0343-4bd8-a6e5-750bf8051c3e-kube-api-access-k4j6k\") pod \"cluster-image-registry-operator-dc59b4c8b-p95pw\" (UID: \"69285d7a-0343-4bd8-a6e5-750bf8051c3e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.321414 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.339422 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xms9\" (UniqueName: \"kubernetes.io/projected/8acf9922-28d9-410b-b416-6685314b9964-kube-api-access-4xms9\") pod \"machine-approver-56656f9798-df4cw\" (UID: \"8acf9922-28d9-410b-b416-6685314b9964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.356879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.360641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpdz\" (UniqueName: \"kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz\") pod \"oauth-openshift-558db77b4-lt258\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.374620 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.375510 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.395824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.407594 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.415214 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.417066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.421561 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" Jan 21 15:25:23 crc kubenswrapper[4773]: W0121 15:25:23.431643 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode26a3952_09c7_455b_ac02_a18c778eec8e.slice/crio-a4f67995b950b15aa0355cdb8768036c75a86a85beb3e8b21376f870a16a2cbf WatchSource:0}: Error finding container a4f67995b950b15aa0355cdb8768036c75a86a85beb3e8b21376f870a16a2cbf: Status 404 returned error can't find the container with id a4f67995b950b15aa0355cdb8768036c75a86a85beb3e8b21376f870a16a2cbf Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.442279 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.461322 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.470965 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.475447 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.475456 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.485094 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.496375 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.498929 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.508562 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.594476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-config\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.594996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-cabundle\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595029 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lrb\" (UniqueName: \"kubernetes.io/projected/b9674283-432f-442d-b7f9-f345f5f4da4b-kube-api-access-85lrb\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595077 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53682c7-0453-4402-aa06-1724da149b3e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595257 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-registration-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: 
\"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595281 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-plugins-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595318 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595394 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5w7\" (UniqueName: \"kubernetes.io/projected/76efaf19-7338-44b6-9eda-d69bd479a2de-kube-api-access-sl5w7\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f299f3-e9af-47b7-9790-c584c36a976f-config\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d0b69ab-8c30-4657-999c-bac2341de0bb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595486 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm2cx\" (UniqueName: \"kubernetes.io/projected/d53682c7-0453-4402-aa06-1724da149b3e-kube-api-access-sm2cx\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67c5c456-b7d7-42b9-842d-eccf213bef77-proxy-tls\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-client\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xvbd\" (UniqueName: 
\"kubernetes.io/projected/8d0b69ab-8c30-4657-999c-bac2341de0bb-kube-api-access-2xvbd\") pod \"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595587 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9674283-432f-442d-b7f9-f345f5f4da4b-proxy-tls\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxgk6\" (UniqueName: \"kubernetes.io/projected/dcacbd95-339a-4b0c-985e-e5daf21c2661-kube-api-access-wxgk6\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.595640 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596755 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53682c7-0453-4402-aa06-1724da149b3e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596793 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-default-certificate\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4k9\" (UniqueName: \"kubernetes.io/projected/44c8da7a-ff65-4275-bf72-bdd8da929a4a-kube-api-access-rf4k9\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc66079f-3257-43a2-9546-276e039d3442-metrics-tls\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-metrics-certs\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596907 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ced1c123-8896-4ab6-9896-d4ddd0de959f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.596943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzcz2\" (UniqueName: \"kubernetes.io/projected/cc790b88-a203-4b87-9c4b-3daa0961208d-kube-api-access-xzcz2\") pod \"migrator-59844c95c7-x4pnk\" (UID: \"cc790b88-a203-4b87-9c4b-3daa0961208d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76efaf19-7338-44b6-9eda-d69bd479a2de-serving-cert\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-service-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-serving-cert\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghkgg\" (UniqueName: \"kubernetes.io/projected/67c5c456-b7d7-42b9-842d-eccf213bef77-kube-api-access-ghkgg\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597176 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksc8d\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597191 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced1c123-8896-4ab6-9896-d4ddd0de959f-config\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g22w\" (UniqueName: \"kubernetes.io/projected/cc66079f-3257-43a2-9546-276e039d3442-kube-api-access-6g22w\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597252 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597268 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc66079f-3257-43a2-9546-276e039d3442-config-volume\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 
21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdjqt\" (UniqueName: \"kubernetes.io/projected/0e296aaa-2bcb-48cc-98de-ddc913780b66-kube-api-access-rdjqt\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597407 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgl8h\" (UniqueName: \"kubernetes.io/projected/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-kube-api-access-lgl8h\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c8da7a-ff65-4275-bf72-bdd8da929a4a-service-ca-bundle\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597482 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32f299f3-e9af-47b7-9790-c584c36a976f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597497 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-images\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: 
\"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597531 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597555 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-socket-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597569 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32f299f3-e9af-47b7-9790-c584c36a976f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2m99\" (UniqueName: \"kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597643 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-mountpoint-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-key\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jg8\" (UniqueName: \"kubernetes.io/projected/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-kube-api-access-q8jg8\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597818 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67c5c456-b7d7-42b9-842d-eccf213bef77-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597845 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7ft\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-kube-api-access-tr7ft\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597923 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-csi-data-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597938 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597974 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/76efaf19-7338-44b6-9eda-d69bd479a2de-config\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.597989 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hw9b\" (UniqueName: \"kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.598053 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-stats-auth\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.598086 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.598100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 
15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.598114 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced1c123-8896-4ab6-9896-d4ddd0de959f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: E0121 15:25:23.599386 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.099372141 +0000 UTC m=+89.023861763 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.618901 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vf566"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.699549 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxgk6\" 
(UniqueName: \"kubernetes.io/projected/dcacbd95-339a-4b0c-985e-e5daf21c2661-kube-api-access-wxgk6\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700185 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53682c7-0453-4402-aa06-1724da149b3e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700219 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-default-certificate\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4k9\" (UniqueName: \"kubernetes.io/projected/44c8da7a-ff65-4275-bf72-bdd8da929a4a-kube-api-access-rf4k9\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 
crc kubenswrapper[4773]: I0121 15:25:23.700262 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-metrics-certs\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700300 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ced1c123-8896-4ab6-9896-d4ddd0de959f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc66079f-3257-43a2-9546-276e039d3442-metrics-tls\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700349 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzcz2\" (UniqueName: \"kubernetes.io/projected/cc790b88-a203-4b87-9c4b-3daa0961208d-kube-api-access-xzcz2\") pod \"migrator-59844c95c7-x4pnk\" (UID: \"cc790b88-a203-4b87-9c4b-3daa0961208d\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def72881-27b4-4161-b94b-3507c648a197-cert\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76efaf19-7338-44b6-9eda-d69bd479a2de-serving-cert\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700456 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79spw\" (UniqueName: \"kubernetes.io/projected/def72881-27b4-4161-b94b-3507c648a197-kube-api-access-79spw\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700478 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-service-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700499 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700515 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-serving-cert\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghkgg\" (UniqueName: \"kubernetes.io/projected/67c5c456-b7d7-42b9-842d-eccf213bef77-kube-api-access-ghkgg\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksc8d\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced1c123-8896-4ab6-9896-d4ddd0de959f-config\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700584 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6g22w\" (UniqueName: \"kubernetes.io/projected/cc66079f-3257-43a2-9546-276e039d3442-kube-api-access-6g22w\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700599 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc66079f-3257-43a2-9546-276e039d3442-config-volume\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700658 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdjqt\" (UniqueName: \"kubernetes.io/projected/0e296aaa-2bcb-48cc-98de-ddc913780b66-kube-api-access-rdjqt\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700718 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgl8h\" (UniqueName: \"kubernetes.io/projected/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-kube-api-access-lgl8h\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700795 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c8da7a-ff65-4275-bf72-bdd8da929a4a-service-ca-bundle\") pod 
\"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700809 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-images\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32f299f3-e9af-47b7-9790-c584c36a976f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32f299f3-e9af-47b7-9790-c584c36a976f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700875 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-certs\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-socket-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-node-bootstrap-token\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700925 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2m99\" (UniqueName: \"kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-mountpoint-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc 
kubenswrapper[4773]: I0121 15:25:23.700971 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.700988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-key\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701003 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701018 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jg8\" (UniqueName: \"kubernetes.io/projected/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-kube-api-access-q8jg8\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67c5c456-b7d7-42b9-842d-eccf213bef77-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701066 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7ft\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-kube-api-access-tr7ft\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rl7\" (UniqueName: \"kubernetes.io/projected/4b27839b-91f9-4b6f-a70c-87f7f73928b2-kube-api-access-c6rl7\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-csi-data-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/76efaf19-7338-44b6-9eda-d69bd479a2de-config\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hw9b\" (UniqueName: \"kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-stats-auth\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701217 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced1c123-8896-4ab6-9896-d4ddd0de959f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-config\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-cabundle\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lrb\" (UniqueName: \"kubernetes.io/projected/b9674283-432f-442d-b7f9-f345f5f4da4b-kube-api-access-85lrb\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 
15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53682c7-0453-4402-aa06-1724da149b3e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701350 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701367 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-registration-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-plugins-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701402 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701419 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5w7\" (UniqueName: \"kubernetes.io/projected/76efaf19-7338-44b6-9eda-d69bd479a2de-kube-api-access-sl5w7\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701434 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f299f3-e9af-47b7-9790-c584c36a976f-config\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701453 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d0b69ab-8c30-4657-999c-bac2341de0bb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm2cx\" (UniqueName: \"kubernetes.io/projected/d53682c7-0453-4402-aa06-1724da149b3e-kube-api-access-sm2cx\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701508 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-client\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701525 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67c5c456-b7d7-42b9-842d-eccf213bef77-proxy-tls\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xvbd\" (UniqueName: \"kubernetes.io/projected/8d0b69ab-8c30-4657-999c-bac2341de0bb-kube-api-access-2xvbd\") pod 
\"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.701556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9674283-432f-442d-b7f9-f345f5f4da4b-proxy-tls\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: E0121 15:25:23.702687 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.202644213 +0000 UTC m=+89.127133965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.703664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.704520 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53682c7-0453-4402-aa06-1724da149b3e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.709214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc66079f-3257-43a2-9546-276e039d3442-config-volume\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.713808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-service-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.718313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-config\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.720810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-cabundle\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.722269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-mountpoint-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.727969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-csi-data-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.728811 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.732211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ced1c123-8896-4ab6-9896-d4ddd0de959f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.732886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9674283-432f-442d-b7f9-f345f5f4da4b-proxy-tls\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.733813 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c8da7a-ff65-4275-bf72-bdd8da929a4a-service-ca-bundle\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.733900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67c5c456-b7d7-42b9-842d-eccf213bef77-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.738351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-signing-key\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.739903 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-serving-cert\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.740752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-registration-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.740889 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-plugins-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.741375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76efaf19-7338-44b6-9eda-d69bd479a2de-serving-cert\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.741375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.741399 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.742192 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc 
kubenswrapper[4773]: I0121 15:25:23.742939 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f299f3-e9af-47b7-9790-c584c36a976f-config\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.743281 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.744048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9674283-432f-442d-b7f9-f345f5f4da4b-images\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.746019 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-ca\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.748473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32f299f3-e9af-47b7-9790-c584c36a976f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.752906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcacbd95-339a-4b0c-985e-e5daf21c2661-etcd-client\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.753678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d0b69ab-8c30-4657-999c-bac2341de0bb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.754302 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.754299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced1c123-8896-4ab6-9896-d4ddd0de959f-config\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.754501 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/0e296aaa-2bcb-48cc-98de-ddc913780b66-socket-dir\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.754856 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53682c7-0453-4402-aa06-1724da149b3e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.754988 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-stats-auth\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.755421 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-metrics-certs\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.763343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.763440 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc66079f-3257-43a2-9546-276e039d3442-metrics-tls\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.764662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.764675 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44c8da7a-ff65-4275-bf72-bdd8da929a4a-default-certificate\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.765283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76efaf19-7338-44b6-9eda-d69bd479a2de-config\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.765674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.766132 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.766251 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.766730 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.770313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67c5c456-b7d7-42b9-842d-eccf213bef77-proxy-tls\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.775586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.776039 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.776424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxgk6\" (UniqueName: \"kubernetes.io/projected/dcacbd95-339a-4b0c-985e-e5daf21c2661-kube-api-access-wxgk6\") pod \"etcd-operator-b45778765-tmkjf\" (UID: \"dcacbd95-339a-4b0c-985e-e5daf21c2661\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.777548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.789722 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2m99\" (UniqueName: \"kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99\") pod \"collect-profiles-29483475-7ftkl\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def72881-27b4-4161-b94b-3507c648a197-cert\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79spw\" (UniqueName: 
\"kubernetes.io/projected/def72881-27b4-4161-b94b-3507c648a197-kube-api-access-79spw\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805142 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-certs\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805209 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-node-bootstrap-token\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.805261 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rl7\" (UniqueName: \"kubernetes.io/projected/4b27839b-91f9-4b6f-a70c-87f7f73928b2-kube-api-access-c6rl7\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: E0121 15:25:23.812814 4773 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.312789224 +0000 UTC m=+89.237278846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.815254 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.821045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-node-bootstrap-token\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.826273 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4k9\" (UniqueName: \"kubernetes.io/projected/44c8da7a-ff65-4275-bf72-bdd8da929a4a-kube-api-access-rf4k9\") pod \"router-default-5444994796-xrdf7\" (UID: \"44c8da7a-ff65-4275-bf72-bdd8da929a4a\") " pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.829120 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ced1c123-8896-4ab6-9896-d4ddd0de959f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9wsc\" (UID: \"ced1c123-8896-4ab6-9896-d4ddd0de959f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.834308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xwn45" event={"ID":"e26a3952-09c7-455b-ac02-a18c778eec8e","Type":"ContainerStarted","Data":"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.834349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xwn45" event={"ID":"e26a3952-09c7-455b-ac02-a18c778eec8e","Type":"ContainerStarted","Data":"a4f67995b950b15aa0355cdb8768036c75a86a85beb3e8b21376f870a16a2cbf"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.837452 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def72881-27b4-4161-b94b-3507c648a197-cert\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.851275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4b27839b-91f9-4b6f-a70c-87f7f73928b2-certs\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.853793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7ft\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-kube-api-access-tr7ft\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.858627 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" event={"ID":"1aaad209-017c-4f1a-af2b-7bdb507ed1a0","Type":"ContainerStarted","Data":"88d1f687ac5438dcbfc91b8858b0abcf3b5aa35436dd011dea961bf507719962"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.861580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jg8\" (UniqueName: \"kubernetes.io/projected/4b6b777c-9c68-48ed-b82f-aa8edf4d3361-kube-api-access-q8jg8\") pod \"service-ca-9c57cc56f-4kntm\" (UID: \"4b6b777c-9c68-48ed-b82f-aa8edf4d3361\") " pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.880402 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.884490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hw9b\" (UniqueName: \"kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b\") pod \"marketplace-operator-79b997595-zpcds\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.897162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" event={"ID":"18d99fca-5145-431e-8bf1-8934b783b569","Type":"ContainerStarted","Data":"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.897338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" 
event={"ID":"18d99fca-5145-431e-8bf1-8934b783b569","Type":"ContainerStarted","Data":"e3a53f51ce3ed98af3f9faa00e5cf1e68e09d0379a5aa02db094840b85e299ff"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.897800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.907062 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.907511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksc8d\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:23 crc kubenswrapper[4773]: E0121 15:25:23.909303 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.40928344 +0000 UTC m=+89.333773062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.920344 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f5jgd"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.921936 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vf566" event={"ID":"454bfb09-e6a5-4e20-af7b-9aa8a52ca678","Type":"ContainerStarted","Data":"e28cdb1e9f56bc00d6bdd140caf76660628845a0ebeaf239fd6cc9bca88d8d38"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.928773 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" event={"ID":"69e4506e-0adb-495a-b22d-ff5ac9e79afa","Type":"ContainerStarted","Data":"2970ef81738e0d6320825f7b79a2d6ec41dccc127a126e46defcaf2f5a876e52"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.932148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzcz2\" (UniqueName: \"kubernetes.io/projected/cc790b88-a203-4b87-9c4b-3daa0961208d-kube-api-access-xzcz2\") pod \"migrator-59844c95c7-x4pnk\" (UID: \"cc790b88-a203-4b87-9c4b-3daa0961208d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.934806 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 
15:25:23.936198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb"] Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.941155 4773 generic.go:334] "Generic (PLEG): container finished" podID="bb30bae7-1663-4658-acf9-f76adf8d12ea" containerID="4b35e3d94ffcb2333b82c7038955b74769443fd5e46b7142f3063cdd761ad6af" exitCode=0 Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.941236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" event={"ID":"bb30bae7-1663-4658-acf9-f76adf8d12ea","Type":"ContainerDied","Data":"4b35e3d94ffcb2333b82c7038955b74769443fd5e46b7142f3063cdd761ad6af"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.941270 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" event={"ID":"bb30bae7-1663-4658-acf9-f76adf8d12ea","Type":"ContainerStarted","Data":"ff3b564dfce3718c8c8bca697791b7bf23f4ec6c4f68697a6709f3c97df6a156"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.959049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lrb\" (UniqueName: \"kubernetes.io/projected/b9674283-432f-442d-b7f9-f345f5f4da4b-kube-api-access-85lrb\") pod \"machine-config-operator-74547568cd-kqcjg\" (UID: \"b9674283-432f-442d-b7f9-f345f5f4da4b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.960971 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdjqt\" (UniqueName: \"kubernetes.io/projected/0e296aaa-2bcb-48cc-98de-ddc913780b66-kube-api-access-rdjqt\") pod \"csi-hostpathplugin-72zzh\" (UID: \"0e296aaa-2bcb-48cc-98de-ddc913780b66\") " pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.978746 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f84504b-c80c-44eb-a333-ce41e3e2a4f0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hhn45\" (UID: \"4f84504b-c80c-44eb-a333-ce41e3e2a4f0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.985837 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.992833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" event={"ID":"8acf9922-28d9-410b-b416-6685314b9964","Type":"ContainerStarted","Data":"820dac3004d5dc10a1ebde766b623990e93b90cce4a651a3dff5ce7ff150df3a"} Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.993804 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" Jan 21 15:25:23 crc kubenswrapper[4773]: I0121 15:25:23.997575 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.009729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.011552 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5w7\" (UniqueName: \"kubernetes.io/projected/76efaf19-7338-44b6-9eda-d69bd479a2de-kube-api-access-sl5w7\") pod \"service-ca-operator-777779d784-h26dv\" (UID: \"76efaf19-7338-44b6-9eda-d69bd479a2de\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.013627 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.51360857 +0000 UTC m=+89.438098272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.028490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.031360 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.031868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" event={"ID":"569e4aab-6b67-4448-9e6e-ecab14ebc87e","Type":"ContainerStarted","Data":"96f758d6c866cabf7dfb5af14d292ebfc07c5a790f0458e46203c777155f592f"} Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.031902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" event={"ID":"569e4aab-6b67-4448-9e6e-ecab14ebc87e","Type":"ContainerStarted","Data":"d6eb16ba47f75ce1e4b5b711ff861f11fc41f97c3772b6018b79b0c18c16843b"} Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.031933 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" 
event={"ID":"569e4aab-6b67-4448-9e6e-ecab14ebc87e","Type":"ContainerStarted","Data":"3aa5983b4f7e675c46f0ce83d2c501292a0efeba111d214a8cf90061ce76b620"} Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.054306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32f299f3-e9af-47b7-9790-c584c36a976f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7lcd2\" (UID: \"32f299f3-e9af-47b7-9790-c584c36a976f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.063848 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lt258"] Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.078369 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ca7aec-6d4f-4411-b972-5abd06ef46f0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xv5z9\" (UID: \"e9ca7aec-6d4f-4411-b972-5abd06ef46f0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.088049 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.097114 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.102719 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghkgg\" (UniqueName: \"kubernetes.io/projected/67c5c456-b7d7-42b9-842d-eccf213bef77-kube-api-access-ghkgg\") pod \"machine-config-controller-84d6567774-zmvbt\" (UID: \"67c5c456-b7d7-42b9-842d-eccf213bef77\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.114721 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.115996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.132816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm2cx\" (UniqueName: \"kubernetes.io/projected/d53682c7-0453-4402-aa06-1724da149b3e-kube-api-access-sm2cx\") pod \"kube-storage-version-migrator-operator-b67b599dd-5b9bq\" (UID: \"d53682c7-0453-4402-aa06-1724da149b3e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.138196 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.139274 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.139973 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.639942693 +0000 UTC m=+89.564432315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.161177 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.162814 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g22w\" (UniqueName: \"kubernetes.io/projected/cc66079f-3257-43a2-9546-276e039d3442-kube-api-access-6g22w\") pod \"dns-default-sc9wz\" (UID: \"cc66079f-3257-43a2-9546-276e039d3442\") " pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.175481 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvt6x"] Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.177743 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq"] Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.219471 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jwjhn"] Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.220582 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.226906 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.726888309 +0000 UTC m=+89.651377931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.231318 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.250323 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xvbd\" (UniqueName: \"kubernetes.io/projected/8d0b69ab-8c30-4657-999c-bac2341de0bb-kube-api-access-2xvbd\") pod \"package-server-manager-789f6589d5-lmndg\" (UID: \"8d0b69ab-8c30-4657-999c-bac2341de0bb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.253302 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.254746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgl8h\" (UniqueName: \"kubernetes.io/projected/cafc9bd5-4993-4fcf-ba6d-91028b10e7e8-kube-api-access-lgl8h\") pod \"control-plane-machine-set-operator-78cbb6b69f-7pqk4\" (UID: \"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.260224 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rl7\" (UniqueName: \"kubernetes.io/projected/4b27839b-91f9-4b6f-a70c-87f7f73928b2-kube-api-access-c6rl7\") pod \"machine-config-server-2lnn7\" (UID: \"4b27839b-91f9-4b6f-a70c-87f7f73928b2\") " pod="openshift-machine-config-operator/machine-config-server-2lnn7"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.268555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79spw\" (UniqueName: \"kubernetes.io/projected/def72881-27b4-4161-b94b-3507c648a197-kube-api-access-79spw\") pod \"ingress-canary-dqkqd\" (UID: \"def72881-27b4-4161-b94b-3507c648a197\") " pod="openshift-ingress-canary/ingress-canary-dqkqd"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.275178 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.301093 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.304982 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.307209 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w5ls2"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.309448 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d6phv"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.313110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.323016 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.332387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.332922 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.832893463 +0000 UTC m=+89.757383085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.333561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.334451 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.834437513 +0000 UTC m=+89.758927135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.336067 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.358852 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.380975 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.431406 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sc9wz"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.434458 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.435039 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:24.935012455 +0000 UTC m=+89.859502077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.457147 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dqkqd"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.461840 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2lnn7"
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.542050 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.542572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.042529287 +0000 UTC m=+89.967018959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.614777 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmkjf"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.643232 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.643776 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.143647714 +0000 UTC m=+90.068137346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.644049 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.644899 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.144883465 +0000 UTC m=+90.069373087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.671448 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg"]
Jan 21 15:25:24 crc kubenswrapper[4773]: W0121 15:25:24.673932 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcacbd95_339a_4b0c_985e_e5daf21c2661.slice/crio-27c88dea38070f54832a0a22160ef9bd09dfd74d59a231f7743d6ac74f115ac2 WatchSource:0}: Error finding container 27c88dea38070f54832a0a22160ef9bd09dfd74d59a231f7743d6ac74f115ac2: Status 404 returned error can't find the container with id 27c88dea38070f54832a0a22160ef9bd09dfd74d59a231f7743d6ac74f115ac2
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.719079 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45"]
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.745236 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.745817 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.245762176 +0000 UTC m=+90.170251798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.746085 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.746807 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.246797282 +0000 UTC m=+90.171286904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.852398 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.860946 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.360907787 +0000 UTC m=+90.285397409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.901161 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xwn45" podStartSLOduration=65.901137466 podStartE2EDuration="1m5.901137466s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:24.90054708 +0000 UTC m=+89.825036702" watchObservedRunningTime="2026-01-21 15:25:24.901137466 +0000 UTC m=+89.825627088"
Jan 21 15:25:24 crc kubenswrapper[4773]: W0121 15:25:24.909878 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9674283_432f_442d_b7f9_f345f5f4da4b.slice/crio-89e8c16a527bd4f75ded39b261fe50f5473a8251a89e661a47ec6c816094523e WatchSource:0}: Error finding container 89e8c16a527bd4f75ded39b261fe50f5473a8251a89e661a47ec6c816094523e: Status 404 returned error can't find the container with id 89e8c16a527bd4f75ded39b261fe50f5473a8251a89e661a47ec6c816094523e
Jan 21 15:25:24 crc kubenswrapper[4773]: I0121 15:25:24.963145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:24 crc kubenswrapper[4773]: E0121 15:25:24.963572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.463557243 +0000 UTC m=+90.388046865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.031490 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.0314717939999998 podStartE2EDuration="2.031471794s" podCreationTimestamp="2026-01-21 15:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:25.030014566 +0000 UTC m=+89.954504188" watchObservedRunningTime="2026-01-21 15:25:25.031471794 +0000 UTC m=+89.955961416"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.065295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.065611 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.565594293 +0000 UTC m=+90.490083915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.078215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" event={"ID":"8a92707c-0d6f-4561-bcda-8997f3c2967d","Type":"ContainerStarted","Data":"9798dfcd9e7739f544f74943f665a4faf7429765fd81b1a493c09a2cdbb17ff0"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.091987 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" event={"ID":"eded1f09-fe44-4693-939c-60335f2d6b22","Type":"ContainerStarted","Data":"0d5a07f7066c4cfa326c59b9d28a564e09abce5f0ec1c87dfaa37c406b255a6b"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.096284 4773 generic.go:334] "Generic (PLEG): container finished" podID="454bfb09-e6a5-4e20-af7b-9aa8a52ca678" containerID="4b77404d779c9121db0f9146ad730cd3d7042ce29acfc4e927a5294f6c1f37e6" exitCode=0
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.096394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vf566" event={"ID":"454bfb09-e6a5-4e20-af7b-9aa8a52ca678","Type":"ContainerDied","Data":"4b77404d779c9121db0f9146ad730cd3d7042ce29acfc4e927a5294f6c1f37e6"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.120134 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" event={"ID":"dcacbd95-339a-4b0c-985e-e5daf21c2661","Type":"ContainerStarted","Data":"27c88dea38070f54832a0a22160ef9bd09dfd74d59a231f7743d6ac74f115ac2"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.127533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w5ls2" event={"ID":"75a6e760-8173-4942-a194-297cce124b98","Type":"ContainerStarted","Data":"72ac7464dc39a24d6056825dd43fc0ed1e69a8ebeee1c72989c9036c723eff91"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.133350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" event={"ID":"2e950a56-b252-49e1-b795-4931be982e88","Type":"ContainerStarted","Data":"2ea18a7eca247413623aa41a6b0ce9b64a2bbc678f248191d74ec7db000aa7fa"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.135208 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" event={"ID":"dde388e0-c191-408f-a40e-72c1414c4d14","Type":"ContainerStarted","Data":"b817641dc04cb9a1277f41c2838d2cc6357c50a2cba8f39bc326ec446807b6d2"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.141061 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" event={"ID":"b347bcd3-0e23-40a4-8e27-9140db184474","Type":"ContainerStarted","Data":"95818b4198e5ed350c7a779ddd2f526f7503989dbc4b0d33ba1ad74f84fe351a"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.165280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" event={"ID":"b9674283-432f-442d-b7f9-f345f5f4da4b","Type":"ContainerStarted","Data":"89e8c16a527bd4f75ded39b261fe50f5473a8251a89e661a47ec6c816094523e"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.166446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.166964 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.666950715 +0000 UTC m=+90.591440327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.169112 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d6phv" event={"ID":"fcb80de5-75af-4316-a192-3ffac092ffd9","Type":"ContainerStarted","Data":"fdcae0d6bff4b010cb2fa9f7e48f51368e25b35538e427bc4032e34d9de545e1"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.188119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" event={"ID":"8acf9922-28d9-410b-b416-6685314b9964","Type":"ContainerStarted","Data":"39d193f58447070087094b4013c5b4dd8d84f483113c80ff4c3627b023b1c894"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.207325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" event={"ID":"8242f19f-45e0-4481-9d43-19305274878b","Type":"ContainerStarted","Data":"bf6cc2e56e611fd5e3bc2e4ec35adc17575a5a95d4f3c971e330b9a31477921f"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.219089 4773 generic.go:334] "Generic (PLEG): container finished" podID="1aaad209-017c-4f1a-af2b-7bdb507ed1a0" containerID="f1d361090e38134f2ba610d7762f2550ad7631bc353f7d1176fe056136c406a9" exitCode=0
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.219192 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" event={"ID":"1aaad209-017c-4f1a-af2b-7bdb507ed1a0","Type":"ContainerDied","Data":"f1d361090e38134f2ba610d7762f2550ad7631bc353f7d1176fe056136c406a9"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.224498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" event={"ID":"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b","Type":"ContainerStarted","Data":"90c26e239a14c57fac3f660766e4bce2314f68159b916ddb48adf374764aaef6"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.232065 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" podStartSLOduration=66.232041472 podStartE2EDuration="1m6.232041472s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:25.229770922 +0000 UTC m=+90.154260544" watchObservedRunningTime="2026-01-21 15:25:25.232041472 +0000 UTC m=+90.156531084"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.233950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" event={"ID":"bb30bae7-1663-4658-acf9-f76adf8d12ea","Type":"ContainerStarted","Data":"0ea56ddfcf81674e766c263b94ffd25531c8f29ccb4b86fde4b2c80fe956bfb7"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.234091 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.240248 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" event={"ID":"ddeb61c7-7fde-4331-abe7-0dc69b173ee1","Type":"ContainerStarted","Data":"d348756bd3cb1f944a2d28993a34445b05087909c92e0339e907b8f101f95400"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.240298 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" event={"ID":"ddeb61c7-7fde-4331-abe7-0dc69b173ee1","Type":"ContainerStarted","Data":"3bcbfebc9e28f8301a95bee82c247d6a49f4f62408c4b2b0d2a89baf9bb68260"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.267747 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.268687 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.768664026 +0000 UTC m=+90.693153648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.289682 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" event={"ID":"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4","Type":"ContainerStarted","Data":"13e70ddffacb6e8cd4d32208cefbbac1d303079a93dac04c10fc9a855962099c"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.300865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" event={"ID":"69e4506e-0adb-495a-b22d-ff5ac9e79afa","Type":"ContainerStarted","Data":"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.301115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.335816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" event={"ID":"0bb536d4-f4ae-44ac-8477-0d14b97ebe04","Type":"ContainerStarted","Data":"56a1b6782761bbde1274e10a5745fcddc6eb630c184faf7ffcd330e5de939b90"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.335864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" event={"ID":"0bb536d4-f4ae-44ac-8477-0d14b97ebe04","Type":"ContainerStarted","Data":"f7ebb8193a309dde405800f5918dc35e1655ba2f794563ad8cad9790b69a03d5"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.338943 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.345282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" event={"ID":"69285d7a-0343-4bd8-a6e5-750bf8051c3e","Type":"ContainerStarted","Data":"71103a9ff33941a86881bac3a7c97bdeac30850cd530058f7060a851e992003e"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.350751 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t6c6p" podStartSLOduration=65.350725146 podStartE2EDuration="1m5.350725146s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:25.313507126 +0000 UTC m=+90.237996748" watchObservedRunningTime="2026-01-21 15:25:25.350725146 +0000 UTC m=+90.275214768"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.356242 4773 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-68gt2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.356290 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" podUID="0bb536d4-f4ae-44ac-8477-0d14b97ebe04" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.356852 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xrdf7" event={"ID":"44c8da7a-ff65-4275-bf72-bdd8da929a4a","Type":"ContainerStarted","Data":"e9c8a0011c902e4a6968bf0048a3b322a72cdd119263c5ca7e99629a55f34ba2"}
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.371653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.372356 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.872338769 +0000 UTC m=+90.796828391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.472838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.472988 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.972957652 +0000 UTC m=+90.897447274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.473241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.473571 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:25.973559137 +0000 UTC m=+90.898048759 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.574586 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.575591 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.075535396 +0000 UTC m=+91.000025068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.680567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.681485 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.181467408 +0000 UTC m=+91.105957030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.789852 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.799309 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.299271669 +0000 UTC m=+91.223761291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.832937 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4kntm"] Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.834049 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.905270 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl"] Jan 21 15:25:25 crc kubenswrapper[4773]: I0121 15:25:25.906719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:25 crc kubenswrapper[4773]: E0121 15:25:25.907151 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.40713745 +0000 UTC m=+91.331627062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.008283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.021290 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.521247226 +0000 UTC m=+91.445736858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.021496 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" podStartSLOduration=67.021469321 podStartE2EDuration="1m7.021469321s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:26.012009434 +0000 UTC m=+90.936499086" watchObservedRunningTime="2026-01-21 15:25:26.021469321 +0000 UTC m=+90.945958943" Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.022034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.022379 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.522367114 +0000 UTC m=+91.446856736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.065079 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" podStartSLOduration=66.065059038 podStartE2EDuration="1m6.065059038s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:26.040683711 +0000 UTC m=+90.965173333" watchObservedRunningTime="2026-01-21 15:25:26.065059038 +0000 UTC m=+90.989548660" Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.075220 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" podStartSLOduration=66.075189911 podStartE2EDuration="1m6.075189911s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:26.068141488 +0000 UTC m=+90.992631130" watchObservedRunningTime="2026-01-21 15:25:26.075189911 +0000 UTC m=+90.999679533" Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.128216 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.129557 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.629500417 +0000 UTC m=+91.553990039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.249867 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.250388 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.750370118 +0000 UTC m=+91.674859740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.350553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.350928 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.850895339 +0000 UTC m=+91.775384961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.351138 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.351442 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.851434762 +0000 UTC m=+91.775924384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.389247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2lnn7" event={"ID":"4b27839b-91f9-4b6f-a70c-87f7f73928b2","Type":"ContainerStarted","Data":"2f479600686ac38b2770760fb3f162d6496cfe821e384b072d6c8e4ea4d1894b"} Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.403642 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" event={"ID":"4b6b777c-9c68-48ed-b82f-aa8edf4d3361","Type":"ContainerStarted","Data":"1a99c4a2b2f8e1ed4dd49c602481e27779156731248fd06dd8c2382935592886"} Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.406812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" event={"ID":"8c3a2458-cc1f-489a-9ce5-57d651ea1754","Type":"ContainerStarted","Data":"39aaa8a3e26746c24920423a4c9f0c8e21e02c4d7b59fa8172488da9da23abde"} Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.422783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" event={"ID":"4f84504b-c80c-44eb-a333-ce41e3e2a4f0","Type":"ContainerStarted","Data":"2b8d6b6ec6471f86681c5a65352a92d3798683e1ae3174c737c8334c50db3ff3"} Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.434772 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h26dv"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.451432 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.453445 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.462844 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:26.962801366 +0000 UTC m=+91.887290988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.494577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.498648 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.517074 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dqkqd"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.548371 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.552332 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-72zzh"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.557241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.559153 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2"] Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.563175 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.063155762 +0000 UTC m=+91.987645464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: W0121 15:25:26.564490 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76efaf19_7338_44b6_9eda_d69bd479a2de.slice/crio-4e953779fdb6406acaa0820fe7b272ef35f77903cdc2f30aa32edcdd2821e581 WatchSource:0}: Error finding container 4e953779fdb6406acaa0820fe7b272ef35f77903cdc2f30aa32edcdd2821e581: Status 404 returned error can't find the container with id 4e953779fdb6406acaa0820fe7b272ef35f77903cdc2f30aa32edcdd2821e581 Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.585399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:25:26 crc kubenswrapper[4773]: W0121 15:25:26.606135 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced1c123_8896_4ab6_9896_d4ddd0de959f.slice/crio-645cd7208fdb3b5a3f21201a1a02f90581c5b96ba6c196ea131bf0ccf187b26b WatchSource:0}: Error finding container 
645cd7208fdb3b5a3f21201a1a02f90581c5b96ba6c196ea131bf0ccf187b26b: Status 404 returned error can't find the container with id 645cd7208fdb3b5a3f21201a1a02f90581c5b96ba6c196ea131bf0ccf187b26b Jan 21 15:25:26 crc kubenswrapper[4773]: W0121 15:25:26.663786 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0b69ab_8c30_4657_999c_bac2341de0bb.slice/crio-641b63939eb96782fba306c262badfd24a3badf267df8c3b4da95b3c066679cc WatchSource:0}: Error finding container 641b63939eb96782fba306c262badfd24a3badf267df8c3b4da95b3c066679cc: Status 404 returned error can't find the container with id 641b63939eb96782fba306c262badfd24a3badf267df8c3b4da95b3c066679cc Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.676478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.677052 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.17700567 +0000 UTC m=+92.101495292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.706187 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.722167 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9"] Jan 21 15:25:26 crc kubenswrapper[4773]: W0121 15:25:26.769141 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c5c456_b7d7_42b9_842d_eccf213bef77.slice/crio-29bd2c32edec623a88fbcad0934e8fcc9647a1b04f3e40b2e0c5ec6fdd9c5b70 WatchSource:0}: Error finding container 29bd2c32edec623a88fbcad0934e8fcc9647a1b04f3e40b2e0c5ec6fdd9c5b70: Status 404 returned error can't find the container with id 29bd2c32edec623a88fbcad0934e8fcc9647a1b04f3e40b2e0c5ec6fdd9c5b70 Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.781080 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.781624 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.281601517 +0000 UTC m=+92.206091139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:26 crc kubenswrapper[4773]: W0121 15:25:26.785384 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ca7aec_6d4f_4411_b972_5abd06ef46f0.slice/crio-fc7af596856b3dc579799ad32ea4a6bd6dd4edf89edb5f7ea606cd3bd8485357 WatchSource:0}: Error finding container fc7af596856b3dc579799ad32ea4a6bd6dd4edf89edb5f7ea606cd3bd8485357: Status 404 returned error can't find the container with id fc7af596856b3dc579799ad32ea4a6bd6dd4edf89edb5f7ea606cd3bd8485357 Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.818047 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.823384 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sc9wz"] Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.887281 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 
15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.887906 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.387887976 +0000 UTC m=+92.312377598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:26 crc kubenswrapper[4773]: I0121 15:25:26.989993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:26 crc kubenswrapper[4773]: E0121 15:25:26.990424 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.49041107 +0000 UTC m=+92.414900692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.008420 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq"]
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.091241 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.091563 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.591524395 +0000 UTC m=+92.516014017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.092342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.093001 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.592981073 +0000 UTC m=+92.517470695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.193547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.194151 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.69412971 +0000 UTC m=+92.618619332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.297437 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.797417853 +0000 UTC m=+92.721907475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.297491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.409337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.409866 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:27.909848383 +0000 UTC m=+92.834337995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.512596 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.513076 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.013051154 +0000 UTC m=+92.937540776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.535173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w5ls2" event={"ID":"75a6e760-8173-4942-a194-297cce124b98","Type":"ContainerStarted","Data":"38aea41fa1167ee1f184a442b319661427bb664681f0dcf895cef85ae5f0fb0d"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.535649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-w5ls2"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.540749 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" event={"ID":"dde388e0-c191-408f-a40e-72c1414c4d14","Type":"ContainerStarted","Data":"e2f5dc3fdc40bc107366f44fa135e35cc408c26a16ab320422a474dec6887e1e"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.542867 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-w5ls2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.542933 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w5ls2" podUID="75a6e760-8173-4942-a194-297cce124b98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.545911 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc9wz" event={"ID":"cc66079f-3257-43a2-9546-276e039d3442","Type":"ContainerStarted","Data":"51b4dff909c54bc1f3e6f0ad36799f0c5d19397689cef3cab24c66d6836a73ae"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.555114 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" event={"ID":"8a92707c-0d6f-4561-bcda-8997f3c2967d","Type":"ContainerStarted","Data":"50b66816653070926571fd2fb65b335591ac03d8f4f5185c0f44bff24bf07cde"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.556441 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.566882 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-w5ls2" podStartSLOduration=68.566849846 podStartE2EDuration="1m8.566849846s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.566390314 +0000 UTC m=+92.490879936" watchObservedRunningTime="2026-01-21 15:25:27.566849846 +0000 UTC m=+92.491339478"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.573039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" event={"ID":"cc790b88-a203-4b87-9c4b-3daa0961208d","Type":"ContainerStarted","Data":"1cef3f6adcf09d23373f61090055a8b35078c31a85c8ea2671c0d8e0fe7365c4"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.575181 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.579558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" event={"ID":"69285d7a-0343-4bd8-a6e5-750bf8051c3e","Type":"ContainerStarted","Data":"8065168ed2f8ab95574d474df1fe4b5a66d92749fe19f928476d7405a253106a"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.588852 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6zgnb" podStartSLOduration=67.588814139 podStartE2EDuration="1m7.588814139s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.586673923 +0000 UTC m=+92.511163565" watchObservedRunningTime="2026-01-21 15:25:27.588814139 +0000 UTC m=+92.513303761"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.614942 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" event={"ID":"8c3a2458-cc1f-489a-9ce5-57d651ea1754","Type":"ContainerStarted","Data":"b9845560e1a2299efc2f6877bf13cb25b6c6677446a3213edef4dfad7a755dc9"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.616823 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.618217 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.118198915 +0000 UTC m=+93.042688537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.620074 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fvd2g" podStartSLOduration=68.620049703 podStartE2EDuration="1m8.620049703s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.617910048 +0000 UTC m=+92.542399680" watchObservedRunningTime="2026-01-21 15:25:27.620049703 +0000 UTC m=+92.544539325"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.675055 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" event={"ID":"76efaf19-7338-44b6-9eda-d69bd479a2de","Type":"ContainerStarted","Data":"6d82892323e36268afc0f4a55c2a845b21107ed0ad55ea3bf0399fbf7a732eb2"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.675562 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" event={"ID":"76efaf19-7338-44b6-9eda-d69bd479a2de","Type":"ContainerStarted","Data":"4e953779fdb6406acaa0820fe7b272ef35f77903cdc2f30aa32edcdd2821e581"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.683503 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" podStartSLOduration=68.683474457 podStartE2EDuration="1m8.683474457s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.682579043 +0000 UTC m=+92.607068665" watchObservedRunningTime="2026-01-21 15:25:27.683474457 +0000 UTC m=+92.607964079"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.684552 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-p95pw" podStartSLOduration=68.684545884 podStartE2EDuration="1m8.684545884s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.644301286 +0000 UTC m=+92.568790918" watchObservedRunningTime="2026-01-21 15:25:27.684545884 +0000 UTC m=+92.609035506"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.718741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.720611 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.220594854 +0000 UTC m=+93.145084666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.758942 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" event={"ID":"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b","Type":"ContainerStarted","Data":"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.760017 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lt258"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.788390 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h26dv" podStartSLOduration=67.788361981 podStartE2EDuration="1m7.788361981s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.787065217 +0000 UTC m=+92.711554849" watchObservedRunningTime="2026-01-21 15:25:27.788361981 +0000 UTC m=+92.712851603"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.823855 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.825192 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.325172781 +0000 UTC m=+93.249662403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.851763 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" podStartSLOduration=68.851735103 podStartE2EDuration="1m8.851735103s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.849571346 +0000 UTC m=+92.774060968" watchObservedRunningTime="2026-01-21 15:25:27.851735103 +0000 UTC m=+92.776224745"
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.892275 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" event={"ID":"4b6b777c-9c68-48ed-b82f-aa8edf4d3361","Type":"ContainerStarted","Data":"cb324b816552fdf441886d7a9e478c00238228807643719d0f49fc3c9a055477"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.914277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" event={"ID":"eded1f09-fe44-4693-939c-60335f2d6b22","Type":"ContainerStarted","Data":"c960c09640ced66fb2e382a3dfb38e463655969c4046a6b1b769f1adaabdb87c"}
Jan 21 15:25:27 crc kubenswrapper[4773]: I0121 15:25:27.932305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:27 crc kubenswrapper[4773]: E0121 15:25:27.933200 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.433189296 +0000 UTC m=+93.357678918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.000683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" event={"ID":"dcacbd95-339a-4b0c-985e-e5daf21c2661","Type":"ContainerStarted","Data":"0d769b9222aa620123df577f4f27088674a61c642df8cff35f11f2c3741a4af6"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.040149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" event={"ID":"8242f19f-45e0-4481-9d43-19305274878b","Type":"ContainerStarted","Data":"4c8517e37f70cdcb13b084ae220565f73d02ea9f3a3a79f7d0029140a4121143"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.040212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.136324 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" event={"ID":"2e950a56-b252-49e1-b795-4931be982e88","Type":"ContainerStarted","Data":"8059a890e5f2ab71ea69f10b87cfe51d769a0924be32e6db6a49683fe02c74c5"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.139222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" event={"ID":"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8","Type":"ContainerStarted","Data":"28d4a4a3bc9efc6b6ff1133d1626cd626446834592088540a76149e510172793"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.140227 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tmkjf" podStartSLOduration=69.140200923 podStartE2EDuration="1m9.140200923s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.059634302 +0000 UTC m=+92.984123924" watchObservedRunningTime="2026-01-21 15:25:28.140200923 +0000 UTC m=+93.064690545"
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.140369 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4kntm" podStartSLOduration=68.140364537 podStartE2EDuration="1m8.140364537s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:27.954996335 +0000 UTC m=+92.879485957" watchObservedRunningTime="2026-01-21 15:25:28.140364537 +0000 UTC m=+93.064854179"
Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.162323 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.662281139 +0000 UTC m=+93.586770761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.164179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.164617 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.664602679 +0000 UTC m=+93.589092301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.164841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" event={"ID":"67c5c456-b7d7-42b9-842d-eccf213bef77","Type":"ContainerStarted","Data":"29bd2c32edec623a88fbcad0934e8fcc9647a1b04f3e40b2e0c5ec6fdd9c5b70"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.239060 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrkhf" podStartSLOduration=69.239030579 podStartE2EDuration="1m9.239030579s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.210014313 +0000 UTC m=+93.134503965" watchObservedRunningTime="2026-01-21 15:25:28.239030579 +0000 UTC m=+93.163520201"
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.258987 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lt258"
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.262953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" event={"ID":"b9674283-432f-442d-b7f9-f345f5f4da4b","Type":"ContainerStarted","Data":"f2bea9d851dc51451612741155e5d4206dcbcd56fff2fd3cd75e973379b8b6e0"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.269263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.270460 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.770440128 +0000 UTC m=+93.694929750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.271399 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.271951 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.771940607 +0000 UTC m=+93.696430229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.333074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" event={"ID":"ced1c123-8896-4ab6-9896-d4ddd0de959f","Type":"ContainerStarted","Data":"8406e398cfb02667a9659b50c1f7cf2c748cb252fde492024f26db5a78e41af8"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.333131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" event={"ID":"ced1c123-8896-4ab6-9896-d4ddd0de959f","Type":"ContainerStarted","Data":"645cd7208fdb3b5a3f21201a1a02f90581c5b96ba6c196ea131bf0ccf187b26b"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.353386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" event={"ID":"8acf9922-28d9-410b-b416-6685314b9964","Type":"ContainerStarted","Data":"5da6d7493dddcec9383f9c22514321ff60e9f832f4593a8d03254f9a624aae63"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.372937 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.374244 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.874221613 +0000 UTC m=+93.798711245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.377096 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" event={"ID":"8d0b69ab-8c30-4657-999c-bac2341de0bb","Type":"ContainerStarted","Data":"c67867e0dfc07a31629a9050af250d4a7812afa7e60246c0162bc5b2e1e04c56"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.377163 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" event={"ID":"8d0b69ab-8c30-4657-999c-bac2341de0bb","Type":"ContainerStarted","Data":"641b63939eb96782fba306c262badfd24a3badf267df8c3b4da95b3c066679cc"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.392557 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" podStartSLOduration=68.392536161 podStartE2EDuration="1m8.392536161s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.296684812 +0000 UTC m=+93.221174434" watchObservedRunningTime="2026-01-21 15:25:28.392536161 +0000 UTC m=+93.317025783"
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.393040 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" podStartSLOduration=68.393033904 podStartE2EDuration="1m8.393033904s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.36526101 +0000 UTC m=+93.289750622" watchObservedRunningTime="2026-01-21 15:25:28.393033904 +0000 UTC m=+93.317523536"
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.413255 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" event={"ID":"cd6b003a-6a86-41a8-849c-c2c30fdcdbf4","Type":"ContainerStarted","Data":"1079e0cf6bdf62ae46719d9231131c62ebe37f54d505417fbb4bca664a63d820"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.417253 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" event={"ID":"4f84504b-c80c-44eb-a333-ce41e3e2a4f0","Type":"ContainerStarted","Data":"f7e4675b256b19e03cfeb2ff1d28cf9ac4383c554e57d7afa47d9f8772a983a4"}
Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.435288 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" event={"ID":"0e296aaa-2bcb-48cc-98de-ddc913780b66","Type":"ContainerStarted","Data":"bc4437c7fe8e9d6a47ae67e816901b3751cc3dcd5a0dd7544b7ebcc3d74cbeac"}
Jan 21
15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.457214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d6phv" event={"ID":"fcb80de5-75af-4316-a192-3ffac092ffd9","Type":"ContainerStarted","Data":"03bbb11424fd54a3222564c929d08d508668fea09301bc79497a51473ba11e55"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.458665 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.483801 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.489651 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:28.989630342 +0000 UTC m=+93.914120164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.549794 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-d6phv" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.561893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" event={"ID":"1aaad209-017c-4f1a-af2b-7bdb507ed1a0","Type":"ContainerStarted","Data":"23e3135e8e46d290961bd2cb6c22f7d73cea4b83b19590f13158c78174b4a90c"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.588404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.589413 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.089394653 +0000 UTC m=+94.013884275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.593523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vf566" event={"ID":"454bfb09-e6a5-4e20-af7b-9aa8a52ca678","Type":"ContainerStarted","Data":"af3b4b5d9bc381ec6142bc7d962d254ccd875c743c54b3326edb92acb9ece0f2"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.595500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xrdf7" event={"ID":"44c8da7a-ff65-4275-bf72-bdd8da929a4a","Type":"ContainerStarted","Data":"ead17481c029f337a8b211edce3fe9f235f25f2c12013861635aaaab111efcaa"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.626413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" event={"ID":"262910dc-030f-4767-833a-507c1a280963","Type":"ContainerStarted","Data":"627103faf0c6509df47db9e80c33cd18a7247e2da096d8ec1466fac844b0ff2f"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.627258 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.632849 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zpcds container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Jan 21 
15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.632897 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.666781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" podStartSLOduration=68.666758459 podStartE2EDuration="1m8.666758459s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.614062165 +0000 UTC m=+93.538551787" watchObservedRunningTime="2026-01-21 15:25:28.666758459 +0000 UTC m=+93.591248091" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.667865 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9wsc" podStartSLOduration=69.667859378 podStartE2EDuration="1m9.667859378s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.666136853 +0000 UTC m=+93.590626475" watchObservedRunningTime="2026-01-21 15:25:28.667859378 +0000 UTC m=+93.592349000" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.691749 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" event={"ID":"ddeb61c7-7fde-4331-abe7-0dc69b173ee1","Type":"ContainerStarted","Data":"802990c5e7418c2da4d6d1be29e3dee25de40f4c1decb464c4c6f02feff418ef"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 
15:25:28.693079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.702307 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.202287875 +0000 UTC m=+94.126777557 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.725116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" event={"ID":"b347bcd3-0e23-40a4-8e27-9140db184474","Type":"ContainerStarted","Data":"aad44e5977535f1c29a6acf5ddac28ec5ecc9b5a1b28b185460a814b24a2fe39"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.748840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2lnn7" event={"ID":"4b27839b-91f9-4b6f-a70c-87f7f73928b2","Type":"ContainerStarted","Data":"f8b7566c8f73a2f179d9128ab1c3a0cc0ef0fcc4186e1ea0e9d29414bcd3d8ed"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.755384 4773 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tnfh2" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.756288 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" event={"ID":"e9ca7aec-6d4f-4411-b972-5abd06ef46f0","Type":"ContainerStarted","Data":"fc7af596856b3dc579799ad32ea4a6bd6dd4edf89edb5f7ea606cd3bd8485357"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.757486 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dqkqd" event={"ID":"def72881-27b4-4161-b94b-3507c648a197","Type":"ContainerStarted","Data":"c8c00ee33e1390573f1918c7998cb5ce68e487a52a56d08a3505a6121cd83576"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.796089 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.797639 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.2976201 +0000 UTC m=+94.222109732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.798024 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" event={"ID":"32f299f3-e9af-47b7-9790-c584c36a976f","Type":"ContainerStarted","Data":"6bc8ed66ea851af82d73201c7ab7e37416c3730e8baa830c305aaa752cbc4e41"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.801331 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-df4cw" podStartSLOduration=69.801314437 podStartE2EDuration="1m9.801314437s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.743098529 +0000 UTC m=+93.667588161" watchObservedRunningTime="2026-01-21 15:25:28.801314437 +0000 UTC m=+93.725804059" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.803196 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.805340 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.806449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" event={"ID":"d53682c7-0453-4402-aa06-1724da149b3e","Type":"ContainerStarted","Data":"3198b24f5559eab64a6d730978a8a2168c75ce2132d10101c5ebb204cc26b374"} Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.817568 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" podStartSLOduration=68.81753945 podStartE2EDuration="1m8.81753945s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.815750203 +0000 UTC m=+93.740239825" watchObservedRunningTime="2026-01-21 15:25:28.81753945 +0000 UTC m=+93.742029072" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.834047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.847269 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.872514 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hhn45" podStartSLOduration=69.872492462 podStartE2EDuration="1m9.872492462s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:28.87240283 +0000 UTC m=+93.796892452" watchObservedRunningTime="2026-01-21 15:25:28.872492462 
+0000 UTC m=+93.796982084" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.907435 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.907518 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptdp\" (UniqueName: \"kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.907590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.907729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:28 crc kubenswrapper[4773]: E0121 15:25:28.916130 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:25:29.416108049 +0000 UTC m=+94.340597671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.937355 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:25:28 crc kubenswrapper[4773]: I0121 15:25:28.981310 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.002286 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.011197 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.012498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.012812 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptdp\" (UniqueName: \"kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.012991 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.013151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.013338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsflj\" (UniqueName: \"kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.013547 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 
15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.013867 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.513833997 +0000 UTC m=+94.438323619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.015232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.015955 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.017998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.023659 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:29 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:29 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:29 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.023753 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.060309 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vf566" podStartSLOduration=70.060282068 podStartE2EDuration="1m10.060282068s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.013449027 +0000 UTC m=+93.937938669" watchObservedRunningTime="2026-01-21 15:25:29.060282068 +0000 UTC m=+93.984771690" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.079263 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" podStartSLOduration=69.079239232 podStartE2EDuration="1m9.079239232s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.077184918 +0000 UTC m=+94.001674540" watchObservedRunningTime="2026-01-21 15:25:29.079239232 +0000 UTC m=+94.003728854" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.079510 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:25:29 crc 
kubenswrapper[4773]: I0121 15:25:29.116334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.116735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.116783 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsflj\" (UniqueName: \"kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.116852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.117396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" 
Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.117514 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.617495199 +0000 UTC m=+94.541984821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.117932 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.126457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptdp\" (UniqueName: \"kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp\") pod \"community-operators-x4rrg\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.163561 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsflj\" (UniqueName: \"kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj\") pod \"certified-operators-vzncg\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " 
pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.165570 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvt6x" podStartSLOduration=70.165539861 podStartE2EDuration="1m10.165539861s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.153010585 +0000 UTC m=+94.077500227" watchObservedRunningTime="2026-01-21 15:25:29.165539861 +0000 UTC m=+94.090029483" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.184663 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.185460 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.186791 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.187545 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.214035 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-d6phv" podStartSLOduration=70.214011065 podStartE2EDuration="1m10.214011065s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.20730943 +0000 UTC m=+94.131799062" watchObservedRunningTime="2026-01-21 15:25:29.214011065 +0000 UTC m=+94.138500687" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.219625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.219837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.219957 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2fc\" (UniqueName: \"kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc\") pod \"community-operators-fzdjk\" (UID: 
\"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.219989 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.220095 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.720076393 +0000 UTC m=+94.644566015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.240852 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k25lw" podStartSLOduration=70.240832374 podStartE2EDuration="1m10.240832374s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.240183528 +0000 UTC m=+94.164673160" watchObservedRunningTime="2026-01-21 15:25:29.240832374 +0000 UTC m=+94.165321996" Jan 21 15:25:29 crc 
kubenswrapper[4773]: I0121 15:25:29.273853 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xrdf7" podStartSLOduration=70.273834225 podStartE2EDuration="1m10.273834225s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.272616833 +0000 UTC m=+94.197106455" watchObservedRunningTime="2026-01-21 15:25:29.273834225 +0000 UTC m=+94.198323847" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.299587 4773 csr.go:261] certificate signing request csr-ndc6j is approved, waiting to be issued Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.318569 4773 csr.go:257] certificate signing request csr-ndc6j is issued Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.325216 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.329957 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.339081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.339237 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2fc\" (UniqueName: \"kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.339274 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.339345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.339896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities\") pod \"community-operators-fzdjk\" 
(UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.340209 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.840195104 +0000 UTC m=+94.764684726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.345058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.354203 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dqkqd" podStartSLOduration=9.354166909 podStartE2EDuration="9.354166909s" podCreationTimestamp="2026-01-21 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.351889939 +0000 UTC m=+94.276379561" watchObservedRunningTime="2026-01-21 15:25:29.354166909 +0000 UTC m=+94.278656531" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.361655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.364091 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.386643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2fc\" (UniqueName: \"kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc\") pod \"community-operators-fzdjk\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.439045 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2lnn7" podStartSLOduration=9.439003301 podStartE2EDuration="9.439003301s" podCreationTimestamp="2026-01-21 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.395858616 +0000 UTC m=+94.320348258" watchObservedRunningTime="2026-01-21 15:25:29.439003301 +0000 UTC m=+94.363492923" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.440288 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.440826 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 15:25:29.940659474 +0000 UTC m=+94.865149096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.442242 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.442366 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8hg\" (UniqueName: \"kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.442432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.442498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.443174 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:29.943164518 +0000 UTC m=+94.867654140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.536728 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" podStartSLOduration=70.536685656 podStartE2EDuration="1m10.536685656s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.536674876 +0000 UTC m=+94.461164518" watchObservedRunningTime="2026-01-21 15:25:29.536685656 +0000 UTC m=+94.461175278" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.538533 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" podStartSLOduration=70.538523635 
podStartE2EDuration="1m10.538523635s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.504410385 +0000 UTC m=+94.428900007" watchObservedRunningTime="2026-01-21 15:25:29.538523635 +0000 UTC m=+94.463013257" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.543452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.543791 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.543857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8hg\" (UniqueName: \"kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.543921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 
15:25:29.544313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.544396 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.044375707 +0000 UTC m=+94.968865329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.544651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.549551 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.590976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8hg\" (UniqueName: \"kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg\") pod \"certified-operators-t6ld8\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.648497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.659787 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.159734994 +0000 UTC m=+95.084224626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.688121 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.750305 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.750895 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.25087628 +0000 UTC m=+95.175365902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.855114 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.855996 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.35598051 +0000 UTC m=+95.280470132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.930136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" event={"ID":"67c5c456-b7d7-42b9-842d-eccf213bef77","Type":"ContainerStarted","Data":"1f41e481b349e9923e97953749e2537def773c4bd68bf61fda5811131deed223"} Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.930237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" event={"ID":"67c5c456-b7d7-42b9-842d-eccf213bef77","Type":"ContainerStarted","Data":"3e23b7df059a0cb6449b735df25c78e8fe2b59bc650e069d73a68173877277b0"} Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.966513 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zmvbt" podStartSLOduration=69.966489801 podStartE2EDuration="1m9.966489801s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:29.965778073 +0000 UTC m=+94.890267705" watchObservedRunningTime="2026-01-21 15:25:29.966489801 +0000 UTC m=+94.890979423" Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.967070 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:29 crc kubenswrapper[4773]: E0121 15:25:29.968321 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.468294978 +0000 UTC m=+95.392784600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.975065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc9wz" event={"ID":"cc66079f-3257-43a2-9546-276e039d3442","Type":"ContainerStarted","Data":"a96f5831a7c46c56509f5f5bc7c4c0e8fbfc28be78b22216e68c643e1b03d56f"} Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.975139 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc9wz" event={"ID":"cc66079f-3257-43a2-9546-276e039d3442","Type":"ContainerStarted","Data":"94c13524b7cf95f828e9a7142da03bb331b10022effeeaace7bc1c2328ce4295"} Jan 21 15:25:29 crc kubenswrapper[4773]: I0121 15:25:29.977162 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.007781 
4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" event={"ID":"eded1f09-fe44-4693-939c-60335f2d6b22","Type":"ContainerStarted","Data":"cd96fc264a4e81d71e3a31631aa92b51dddccb9d1cb2d955ecd75b41700ffec4"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.016453 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:30 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:30 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:30 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.016515 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.058424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" event={"ID":"262910dc-030f-4767-833a-507c1a280963","Type":"ContainerStarted","Data":"ef3bd6511f8e7459721d629cd0705d7ca2707cdde18393d9f251bb354eca849d"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.059472 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sc9wz" podStartSLOduration=10.059451435 podStartE2EDuration="10.059451435s" podCreationTimestamp="2026-01-21 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.021289189 +0000 UTC m=+94.945778831" watchObservedRunningTime="2026-01-21 15:25:30.059451435 +0000 UTC m=+94.983941057" Jan 21 
15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.059648 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zpcds container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.059737 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.068598 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.069989 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jwjhn" podStartSLOduration=71.069950108 podStartE2EDuration="1m11.069950108s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.058371606 +0000 UTC m=+94.982861228" watchObservedRunningTime="2026-01-21 15:25:30.069950108 +0000 UTC m=+94.994439730" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.072393 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.572372611 +0000 UTC m=+95.496862443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.100730 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kqcjg" event={"ID":"b9674283-432f-442d-b7f9-f345f5f4da4b","Type":"ContainerStarted","Data":"2fb50fc16a2aed7c7c9782cf7fa6a86590fb21aa6087473b1ab97ca7c2f68e3c"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.152166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" event={"ID":"cc790b88-a203-4b87-9c4b-3daa0961208d","Type":"ContainerStarted","Data":"b71b9335d552b7f6ee071399669d93893568868604d9e2a7a449adce2752141e"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.152237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" event={"ID":"cc790b88-a203-4b87-9c4b-3daa0961208d","Type":"ContainerStarted","Data":"11c24fbd66100c2cbac01b604be85c339b59046ac1b8eb52ec0112a7fba2408b"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.169284 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.169906 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.669887784 +0000 UTC m=+95.594377396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.197625 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dqkqd" event={"ID":"def72881-27b4-4161-b94b-3507c648a197","Type":"ContainerStarted","Data":"0f8b37b7cafe2bd1f5d04f16adc290ce65966c2df0789377ab9ab1f687bfd6ba"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.253298 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" event={"ID":"8d0b69ab-8c30-4657-999c-bac2341de0bb","Type":"ContainerStarted","Data":"004cb2313eb2984c64d700669a9e586ef3e73507c747427f713764dd303aa945"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.254512 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.270528 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.271025 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.77100915 +0000 UTC m=+95.695498772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.299137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vf566" event={"ID":"454bfb09-e6a5-4e20-af7b-9aa8a52ca678","Type":"ContainerStarted","Data":"dcf9ee5d1535872e18be6a47a692e374fba1b648e118aebeef4dc31e9c2e63ca"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.320951 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-21 15:20:29 +0000 UTC, rotation deadline is 2026-11-12 03:42:12.948458008 +0000 UTC Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.321014 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7068h16m42.627447466s for next certificate rotation Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.322170 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x4pnk" podStartSLOduration=71.322142842 podStartE2EDuration="1m11.322142842s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.218164962 +0000 UTC m=+95.142654584" watchObservedRunningTime="2026-01-21 15:25:30.322142842 +0000 UTC m=+95.246632464" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.339488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" event={"ID":"b347bcd3-0e23-40a4-8e27-9140db184474","Type":"ContainerStarted","Data":"bac2ee0eba2ed593c86449d3a85ce40928429900eccec04ae02a98b8eddf65f8"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.373929 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.374102 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.874070716 +0000 UTC m=+95.798560338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.375334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.379388 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.879365634 +0000 UTC m=+95.803855526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.385152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" event={"ID":"0e296aaa-2bcb-48cc-98de-ddc913780b66","Type":"ContainerStarted","Data":"c6765d23a00ba163156211e83f0b1bd882612625f91a03c6bf7e1595b57fa2db"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.412922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7pqk4" event={"ID":"cafc9bd5-4993-4fcf-ba6d-91028b10e7e8","Type":"ContainerStarted","Data":"6b634eac437bd4564768398ed89d8d56cc33c8b47cf0610aab4d56451b976f9d"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.440112 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7lcd2" event={"ID":"32f299f3-e9af-47b7-9790-c584c36a976f","Type":"ContainerStarted","Data":"7d9d0f9d164d6ba05d31dbf4592d23fe2f74056f593c1b60bc80669f2b445de1"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.449530 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" podStartSLOduration=70.449506752 podStartE2EDuration="1m10.449506752s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.342473412 +0000 
UTC m=+95.266963044" watchObservedRunningTime="2026-01-21 15:25:30.449506752 +0000 UTC m=+95.373996374" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.456134 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.480162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.482128 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:30.982109183 +0000 UTC m=+95.906598805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.483330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" event={"ID":"e9ca7aec-6d4f-4411-b972-5abd06ef46f0","Type":"ContainerStarted","Data":"c1ea812b8ed7d0afd0816ba7d81037932c7c3c6f8d91410f0f2c6e433a5f1ef9"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.483390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" event={"ID":"e9ca7aec-6d4f-4411-b972-5abd06ef46f0","Type":"ContainerStarted","Data":"3b231915918c177d9c5a7dd7fffac7698d16a69e575791ff29449050662ef80f"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.503153 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5b9bq" event={"ID":"d53682c7-0453-4402-aa06-1724da149b3e","Type":"ContainerStarted","Data":"7d0a5997474e5902a0c232e2203308b22f2fa0755c31e246ca9d3837f5b60063"} Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.504889 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-w5ls2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.505040 4773 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-w5ls2" podUID="75a6e760-8173-4942-a194-297cce124b98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.506246 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.516940 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-f5jgd" podStartSLOduration=70.516920849 podStartE2EDuration="1m10.516920849s" podCreationTimestamp="2026-01-21 15:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.494215568 +0000 UTC m=+95.418705190" watchObservedRunningTime="2026-01-21 15:25:30.516920849 +0000 UTC m=+95.441410471" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.554036 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-946gq" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.567859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.582942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.593387 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.093368453 +0000 UTC m=+96.017858075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.648819 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.661341 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xv5z9" podStartSLOduration=71.661313144 podStartE2EDuration="1m11.661313144s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:30.641843237 +0000 UTC m=+95.566332859" watchObservedRunningTime="2026-01-21 15:25:30.661313144 +0000 UTC m=+95.585802756" Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.710508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.718344 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.218302729 +0000 UTC m=+96.142792361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.759374 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.776809 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.276782974 +0000 UTC m=+96.201272586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.885628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.886200 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.386180806 +0000 UTC m=+96.310670428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.935122 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:25:30 crc kubenswrapper[4773]: I0121 15:25:30.987893 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:30 crc kubenswrapper[4773]: E0121 15:25:30.988349 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.488331949 +0000 UTC m=+96.412821571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.012392 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:31 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:31 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:31 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.012881 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.088947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:31 crc kubenswrapper[4773]: E0121 15:25:31.089528 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 15:25:31.589504916 +0000 UTC m=+96.513994538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.121915 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.123160 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.126084 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.146243 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.191269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:31 crc kubenswrapper[4773]: E0121 15:25:31.191820 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:25:31.691803313 +0000 UTC m=+96.616292935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.251291 4773 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.292219 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.292594 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2sg\" (UniqueName: \"kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.292732 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " 
pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.292762 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: E0121 15:25:31.292970 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.79294702 +0000 UTC m=+96.717436642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.393549 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.393603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities\") pod \"redhat-marketplace-qr2xs\" (UID: 
\"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.393685 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.393727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2sg\" (UniqueName: \"kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.394499 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.394851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: E0121 15:25:31.395138 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 15:25:31.895124693 +0000 UTC m=+96.819614315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cb2vm" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.417197 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2sg\" (UniqueName: \"kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg\") pod \"redhat-marketplace-qr2xs\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.494681 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:31 crc kubenswrapper[4773]: E0121 15:25:31.496374 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 15:25:31.996350732 +0000 UTC m=+96.920840354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.507872 4773 generic.go:334] "Generic (PLEG): container finished" podID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerID="d536bb88e9cc0c29635c99980a9a90544f09b3ac8da4b899b464264d56a9ab79" exitCode=0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.507938 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerDied","Data":"d536bb88e9cc0c29635c99980a9a90544f09b3ac8da4b899b464264d56a9ab79"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.507967 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerStarted","Data":"8bebf2c7d597e7fc57e68d0ea813a13094364c93ed1f82872a127e56da6016d4"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.510161 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.511057 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerID="1768459ed151d20ff8d317cb6e7dab6c8df171533708f3859a2b9ce48c6e2a4f" exitCode=0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.511098 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" 
event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerDied","Data":"1768459ed151d20ff8d317cb6e7dab6c8df171533708f3859a2b9ce48c6e2a4f"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.511117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerStarted","Data":"d9e777a5afb1411614059f5afd95f1b74c7451607f755421597315778eb02ed0"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.513825 4773 generic.go:334] "Generic (PLEG): container finished" podID="d36b150f-af27-41a9-b699-db2207d44d58" containerID="1b63ab3dc83648f1114ef5a532e065a6842ad97d041ad8b3f805363e53d2a26b" exitCode=0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.513875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerDied","Data":"1b63ab3dc83648f1114ef5a532e065a6842ad97d041ad8b3f805363e53d2a26b"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.513893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerStarted","Data":"55c6c00e57fb64e6fc5f4e11cbe6f1b2d5bb5e8e5528a7d4f119a0c105dfe87c"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.515730 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" event={"ID":"0e296aaa-2bcb-48cc-98de-ddc913780b66","Type":"ContainerStarted","Data":"87575c5eb8065d4e07c5a43cab8f1a4a405479d065d9665850547f093aeacc95"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.517372 4773 generic.go:334] "Generic (PLEG): container finished" podID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerID="0fe1baca2236af71ff20ae6c95d0231e4a9fe2967cae31c8c4f01a1aa79c8665" exitCode=0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 
15:25:31.517417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerDied","Data":"0fe1baca2236af71ff20ae6c95d0231e4a9fe2967cae31c8c4f01a1aa79c8665"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.517434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerStarted","Data":"1f4dd542a9801bb4fa4a63fd8c9f28d461953eadfe7cc8718e8d5582b5faf890"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.525467 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.526393 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.531449 4773 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-21T15:25:31.251341915Z","Handler":null,"Name":""} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.538104 4773 generic.go:334] "Generic (PLEG): container finished" podID="8c3a2458-cc1f-489a-9ce5-57d651ea1754" containerID="b9845560e1a2299efc2f6877bf13cb25b6c6677446a3213edef4dfad7a755dc9" exitCode=0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.538155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" event={"ID":"8c3a2458-cc1f-489a-9ce5-57d651ea1754","Type":"ContainerDied","Data":"b9845560e1a2299efc2f6877bf13cb25b6c6677446a3213edef4dfad7a755dc9"} Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.542481 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.545040 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.572024 4773 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.572080 4773 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.578542 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.597421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.597852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhn6\" (UniqueName: \"kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.598088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.598284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.670302 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.670349 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.705011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhn6\" (UniqueName: \"kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.706488 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.706877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.714055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.716271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.744987 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhn6\" (UniqueName: \"kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6\") pod \"redhat-marketplace-ldb72\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.801033 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cb2vm\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") " pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.808440 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.857915 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.874477 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.929369 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.930545 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.934873 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:25:31 crc kubenswrapper[4773]: I0121 15:25:31.946889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.005125 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:32 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:32 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:32 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.005205 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.017525 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7cmz\" (UniqueName: \"kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.017616 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities\") pod \"redhat-operators-z6mw7\" (UID: 
\"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.017722 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.026988 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.087937 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.129434 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7cmz\" (UniqueName: \"kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.129497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.129531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") 
" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.130115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.130591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.137268 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.138676 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.151038 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.168557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7cmz\" (UniqueName: \"kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz\") pod \"redhat-operators-z6mw7\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.230565 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.230849 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.230908 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hmf\" (UniqueName: \"kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.284347 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.322864 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.332279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.332359 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hmf\" (UniqueName: \"kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.332540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.332989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.333561 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.392265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hmf\" (UniqueName: \"kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf\") pod \"redhat-operators-wvqn8\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.491479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"] Jan 21 15:25:32 crc kubenswrapper[4773]: W0121 15:25:32.516560 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f29f37_d4e3_4767_8077_45ffbf1ddd6d.slice/crio-5f3603d112dc821a9501675598e35d1ceefbff7a584f774951e2ae2d419ff3ae WatchSource:0}: Error finding container 5f3603d112dc821a9501675598e35d1ceefbff7a584f774951e2ae2d419ff3ae: Status 404 returned error can't find the container with id 5f3603d112dc821a9501675598e35d1ceefbff7a584f774951e2ae2d419ff3ae Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.548870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" event={"ID":"0e296aaa-2bcb-48cc-98de-ddc913780b66","Type":"ContainerStarted","Data":"f0dfbd748b1470b71ca498460ea85c496d1a9445de6ae4deec1436f1a7f83b6a"} Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.550160 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerStarted","Data":"f193b9dd42b8876645d8072220ea6dae62bbf7e63fca216fbb9a7b760c9a5b2b"} Jan 21 15:25:32 crc 
kubenswrapper[4773]: I0121 15:25:32.551280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" event={"ID":"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d","Type":"ContainerStarted","Data":"5f3603d112dc821a9501675598e35d1ceefbff7a584f774951e2ae2d419ff3ae"} Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.552566 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerStarted","Data":"a682892a8e40069cdb7386b6d507533a9f48687b210e4af8c3fb8163ade6850c"} Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.560235 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.585595 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:25:32 crc kubenswrapper[4773]: I0121 15:25:32.990272 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.008130 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:33 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:33 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:33 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.008211 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.055392 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume\") pod \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.055497 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2m99\" (UniqueName: \"kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99\") pod \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.055522 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume\") pod \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\" (UID: \"8c3a2458-cc1f-489a-9ce5-57d651ea1754\") " 
Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.057221 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume" (OuterVolumeSpecName: "config-volume") pod "8c3a2458-cc1f-489a-9ce5-57d651ea1754" (UID: "8c3a2458-cc1f-489a-9ce5-57d651ea1754"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.064032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99" (OuterVolumeSpecName: "kube-api-access-v2m99") pod "8c3a2458-cc1f-489a-9ce5-57d651ea1754" (UID: "8c3a2458-cc1f-489a-9ce5-57d651ea1754"). InnerVolumeSpecName "kube-api-access-v2m99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.064360 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8c3a2458-cc1f-489a-9ce5-57d651ea1754" (UID: "8c3a2458-cc1f-489a-9ce5-57d651ea1754"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.087227 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.088794 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.096391 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.155069 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.155139 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.157544 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c3a2458-cc1f-489a-9ce5-57d651ea1754-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.157573 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2m99\" (UniqueName: \"kubernetes.io/projected/8c3a2458-cc1f-489a-9ce5-57d651ea1754-kube-api-access-v2m99\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.157583 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c3a2458-cc1f-489a-9ce5-57d651ea1754-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.159471 4773 patch_prober.go:28] interesting pod/console-f9d7485db-xwn45 container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.159536 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xwn45" podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.231766 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.233331 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.257640 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.402187 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.476417 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-w5ls2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.476489 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-w5ls2" podUID="75a6e760-8173-4942-a194-297cce124b98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: 
connection refused" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.476536 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-w5ls2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.476611 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w5ls2" podUID="75a6e760-8173-4942-a194-297cce124b98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.552473 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.570058 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" event={"ID":"0e296aaa-2bcb-48cc-98de-ddc913780b66","Type":"ContainerStarted","Data":"295310f7f37ba0fcf5919af24dba97bf2e61b4d788055d89762b82f3ab5b7115"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.576202 4773 generic.go:334] "Generic (PLEG): container finished" podID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerID="c0c83841ea929110342dca06c276be385d406550c40ee586024029c5c04ae923" exitCode=0 Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.576302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerDied","Data":"c0c83841ea929110342dca06c276be385d406550c40ee586024029c5c04ae923"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.588416 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" 
event={"ID":"8c3a2458-cc1f-489a-9ce5-57d651ea1754","Type":"ContainerDied","Data":"39aaa8a3e26746c24920423a4c9f0c8e21e02c4d7b59fa8172488da9da23abde"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.588481 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39aaa8a3e26746c24920423a4c9f0c8e21e02c4d7b59fa8172488da9da23abde" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.588505 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl" Jan 21 15:25:33 crc kubenswrapper[4773]: W0121 15:25:33.599897 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46dfa17b_ab20_4d55_933d_2ee7869977c5.slice/crio-2f31ee880ccbdbfae41beea24f14893eded9ed9787878db1b1d3b029a927879f WatchSource:0}: Error finding container 2f31ee880ccbdbfae41beea24f14893eded9ed9787878db1b1d3b029a927879f: Status 404 returned error can't find the container with id 2f31ee880ccbdbfae41beea24f14893eded9ed9787878db1b1d3b029a927879f Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.605764 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-72zzh" podStartSLOduration=13.605731929 podStartE2EDuration="13.605731929s" podCreationTimestamp="2026-01-21 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:33.60037653 +0000 UTC m=+98.524866172" watchObservedRunningTime="2026-01-21 15:25:33.605731929 +0000 UTC m=+98.530221551" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.628885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" 
event={"ID":"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d","Type":"ContainerStarted","Data":"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.629457 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.635145 4773 generic.go:334] "Generic (PLEG): container finished" podID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerID="5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d" exitCode=0 Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.635236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerDied","Data":"5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.640449 4773 generic.go:334] "Generic (PLEG): container finished" podID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerID="d91b2014071a56219bfa4c05b48c0308b984013dd969cc1612279804668b37d1" exitCode=0 Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.640720 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerDied","Data":"d91b2014071a56219bfa4c05b48c0308b984013dd969cc1612279804668b37d1"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.640785 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerStarted","Data":"915245cf347a80f9507681cc30b1a6c6073759a8aeb245e6ca3c688c5688b2b9"} Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.649022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-vf566" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.653063 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pzhf4" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.669051 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" podStartSLOduration=74.669020009 podStartE2EDuration="1m14.669020009s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:33.659231584 +0000 UTC m=+98.583721216" watchObservedRunningTime="2026-01-21 15:25:33.669020009 +0000 UTC m=+98.593509631" Jan 21 15:25:33 crc kubenswrapper[4773]: I0121 15:25:33.998713 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:34 crc kubenswrapper[4773]: I0121 15:25:34.018587 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:34 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:34 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:34 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:34 crc kubenswrapper[4773]: I0121 15:25:34.018711 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:34 crc kubenswrapper[4773]: I0121 15:25:34.652260 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerID="2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4" exitCode=0 Jan 21 15:25:34 crc kubenswrapper[4773]: I0121 15:25:34.654129 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerDied","Data":"2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4"} Jan 21 15:25:34 crc kubenswrapper[4773]: I0121 15:25:34.654168 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerStarted","Data":"2f31ee880ccbdbfae41beea24f14893eded9ed9787878db1b1d3b029a927879f"} Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.002039 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:35 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 21 15:25:35 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:35 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.002295 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.828086 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:25:35 crc kubenswrapper[4773]: E0121 15:25:35.828397 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3a2458-cc1f-489a-9ce5-57d651ea1754" containerName="collect-profiles" Jan 21 
15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.828412 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3a2458-cc1f-489a-9ce5-57d651ea1754" containerName="collect-profiles" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.828532 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3a2458-cc1f-489a-9ce5-57d651ea1754" containerName="collect-profiles" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.829227 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.835155 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.836603 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.858761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.930209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:35 crc kubenswrapper[4773]: I0121 15:25:35.930740 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 
15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.000376 4773 patch_prober.go:28] interesting pod/router-default-5444994796-xrdf7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 15:25:36 crc kubenswrapper[4773]: [+]has-synced ok Jan 21 15:25:36 crc kubenswrapper[4773]: [+]process-running ok Jan 21 15:25:36 crc kubenswrapper[4773]: healthz check failed Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.000445 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrdf7" podUID="44c8da7a-ff65-4275-bf72-bdd8da929a4a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.037543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.037676 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.037801 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:36 
crc kubenswrapper[4773]: I0121 15:25:36.057686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.206119 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:25:36 crc kubenswrapper[4773]: I0121 15:25:36.778407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.005505 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.011683 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xrdf7" Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.711006 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a","Type":"ContainerStarted","Data":"2c50d3a6dccccc27e28d57c5f5c53c817ea201f73ef961e0488d90c8184b2255"} Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.869743 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.879759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a01fed4-2691-453e-b74f-c000d5125b53-metrics-certs\") pod \"network-metrics-daemon-8n66g\" (UID: \"1a01fed4-2691-453e-b74f-c000d5125b53\") " pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.990572 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8n66g" Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.996592 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:25:37 crc kubenswrapper[4773]: I0121 15:25:37.999310 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.007362 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.007662 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.009641 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.073931 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.074058 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.176106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.176247 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.176545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.206131 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.337492 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.477619 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8n66g"] Jan 21 15:25:38 crc kubenswrapper[4773]: I0121 15:25:38.753458 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8n66g" event={"ID":"1a01fed4-2691-453e-b74f-c000d5125b53","Type":"ContainerStarted","Data":"e4e3fcd7efa8d9f94d92f08cf67edb617755d3a53149faadb60d7b201bcd23e4"} Jan 21 15:25:39 crc kubenswrapper[4773]: I0121 15:25:39.248194 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 15:25:39 crc kubenswrapper[4773]: W0121 15:25:39.304642 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca6b1c3e_f521_4135_9f75_ab172a0471c8.slice/crio-693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142 WatchSource:0}: Error finding container 693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142: Status 404 returned error can't find the container with id 693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142 Jan 21 15:25:39 crc kubenswrapper[4773]: I0121 15:25:39.438385 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sc9wz" Jan 21 15:25:39 crc kubenswrapper[4773]: I0121 15:25:39.795924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a","Type":"ContainerStarted","Data":"d8baa9caa2f49941b6d47019bffedcbdefec2299960e3a1df20375ffe041391b"} Jan 21 15:25:39 crc kubenswrapper[4773]: I0121 15:25:39.800863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"ca6b1c3e-f521-4135-9f75-ab172a0471c8","Type":"ContainerStarted","Data":"693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142"} Jan 21 15:25:39 crc kubenswrapper[4773]: I0121 15:25:39.831300 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.831283929 podStartE2EDuration="4.831283929s" podCreationTimestamp="2026-01-21 15:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:25:39.829474032 +0000 UTC m=+104.753963654" watchObservedRunningTime="2026-01-21 15:25:39.831283929 +0000 UTC m=+104.755773551" Jan 21 15:25:40 crc kubenswrapper[4773]: I0121 15:25:40.816362 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8n66g" event={"ID":"1a01fed4-2691-453e-b74f-c000d5125b53","Type":"ContainerStarted","Data":"d8d0f879d772ef31a414a5cbaff66ee383500e23c7465b6ec0ce145ec84fda62"} Jan 21 15:25:41 crc kubenswrapper[4773]: I0121 15:25:41.851139 4773 generic.go:334] "Generic (PLEG): container finished" podID="2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" containerID="d8baa9caa2f49941b6d47019bffedcbdefec2299960e3a1df20375ffe041391b" exitCode=0 Jan 21 15:25:41 crc kubenswrapper[4773]: I0121 15:25:41.851427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a","Type":"ContainerDied","Data":"d8baa9caa2f49941b6d47019bffedcbdefec2299960e3a1df20375ffe041391b"} Jan 21 15:25:41 crc kubenswrapper[4773]: I0121 15:25:41.872393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca6b1c3e-f521-4135-9f75-ab172a0471c8","Type":"ContainerStarted","Data":"e517568a08bb5c19444efb14c71340b1438724e2591e6d6226d1ca1241100197"} Jan 21 15:25:42 crc 
kubenswrapper[4773]: I0121 15:25:42.888067 4773 generic.go:334] "Generic (PLEG): container finished" podID="ca6b1c3e-f521-4135-9f75-ab172a0471c8" containerID="e517568a08bb5c19444efb14c71340b1438724e2591e6d6226d1ca1241100197" exitCode=0 Jan 21 15:25:42 crc kubenswrapper[4773]: I0121 15:25:42.888122 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca6b1c3e-f521-4135-9f75-ab172a0471c8","Type":"ContainerDied","Data":"e517568a08bb5c19444efb14c71340b1438724e2591e6d6226d1ca1241100197"} Jan 21 15:25:43 crc kubenswrapper[4773]: I0121 15:25:43.159026 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:43 crc kubenswrapper[4773]: I0121 15:25:43.164235 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:25:43 crc kubenswrapper[4773]: I0121 15:25:43.494420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-w5ls2" Jan 21 15:25:50 crc kubenswrapper[4773]: I0121 15:25:50.232314 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:25:52 crc kubenswrapper[4773]: I0121 15:25:52.094250 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.485879 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.574920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access\") pod \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.575017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir\") pod \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\" (UID: \"ca6b1c3e-f521-4135-9f75-ab172a0471c8\") " Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.575253 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ca6b1c3e-f521-4135-9f75-ab172a0471c8" (UID: "ca6b1c3e-f521-4135-9f75-ab172a0471c8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.584400 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ca6b1c3e-f521-4135-9f75-ab172a0471c8" (UID: "ca6b1c3e-f521-4135-9f75-ab172a0471c8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.676372 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:57 crc kubenswrapper[4773]: I0121 15:25:57.676407 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca6b1c3e-f521-4135-9f75-ab172a0471c8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:25:58 crc kubenswrapper[4773]: I0121 15:25:58.025600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca6b1c3e-f521-4135-9f75-ab172a0471c8","Type":"ContainerDied","Data":"693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142"} Jan 21 15:25:58 crc kubenswrapper[4773]: I0121 15:25:58.025647 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="693221ef7fcb17fdd769ec59817fd3f981504b78201c4bcac4e67c672f9df142" Jan 21 15:25:58 crc kubenswrapper[4773]: I0121 15:25:58.025721 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 15:26:04 crc kubenswrapper[4773]: I0121 15:26:04.258518 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lmndg" Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.568716 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.733620 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access\") pod \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.733869 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir\") pod \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\" (UID: \"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a\") " Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.734013 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" (UID: "2caf7fd4-2767-4ed6-a5cc-888d0b9d472a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.734274 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.738772 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" (UID: "2caf7fd4-2767-4ed6-a5cc-888d0b9d472a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:10 crc kubenswrapper[4773]: I0121 15:26:10.835101 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2caf7fd4-2767-4ed6-a5cc-888d0b9d472a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:11 crc kubenswrapper[4773]: I0121 15:26:11.115639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2caf7fd4-2767-4ed6-a5cc-888d0b9d472a","Type":"ContainerDied","Data":"2c50d3a6dccccc27e28d57c5f5c53c817ea201f73ef961e0488d90c8184b2255"} Jan 21 15:26:11 crc kubenswrapper[4773]: I0121 15:26:11.115681 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c50d3a6dccccc27e28d57c5f5c53c817ea201f73ef961e0488d90c8184b2255" Jan 21 15:26:11 crc kubenswrapper[4773]: I0121 15:26:11.115967 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.393038 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 15:26:15 crc kubenswrapper[4773]: E0121 15:26:15.393626 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" containerName="pruner" Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.393640 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" containerName="pruner" Jan 21 15:26:15 crc kubenswrapper[4773]: E0121 15:26:15.393656 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6b1c3e-f521-4135-9f75-ab172a0471c8" containerName="pruner" Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.393662 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6b1c3e-f521-4135-9f75-ab172a0471c8" 
containerName="pruner"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.393783 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6b1c3e-f521-4135-9f75-ab172a0471c8" containerName="pruner"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.393806 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caf7fd4-2767-4ed6-a5cc-888d0b9d472a" containerName="pruner"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.394192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.397547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.397730 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.404986 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.511116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.511295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.613266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.613341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.613478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.633346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:15 crc kubenswrapper[4773]: I0121 15:26:15.713141 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: E0121 15:26:19.684826 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 21 15:26:19 crc kubenswrapper[4773]: E0121 15:26:19.685426 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nh8hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t6ld8_openshift-marketplace(dbec81d4-f8ff-45fa-b7f8-b29709e477bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:19 crc kubenswrapper[4773]: E0121 15:26:19.686624 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t6ld8" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.794361 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.795593 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.800326 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.884121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.884206 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.884235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.986538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.986634 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.986668 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.986792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:19 crc kubenswrapper[4773]: I0121 15:26:19.986912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:20 crc kubenswrapper[4773]: I0121 15:26:20.006746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access\") pod \"installer-9-crc\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:20 crc kubenswrapper[4773]: I0121 15:26:20.115946 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.402084 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.402210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.402282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.402401 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.405739 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.406406 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.408923 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.421363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.423085 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.428524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.433980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.435314 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.469623 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.484048 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:26:21 crc kubenswrapper[4773]: I0121 15:26:21.500312 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 15:26:21 crc kubenswrapper[4773]: E0121 15:26:21.633172 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t6ld8" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb"
Jan 21 15:26:21 crc kubenswrapper[4773]: E0121 15:26:21.702458 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Jan 21 15:26:21 crc kubenswrapper[4773]: E0121 15:26:21.703058 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rptdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x4rrg_openshift-marketplace(d36b150f-af27-41a9-b699-db2207d44d58): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:21 crc kubenswrapper[4773]: E0121 15:26:21.704212 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x4rrg" podUID="d36b150f-af27-41a9-b699-db2207d44d58"
Jan 21 15:26:23 crc kubenswrapper[4773]: E0121 15:26:23.009297 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-x4rrg" podUID="d36b150f-af27-41a9-b699-db2207d44d58"
Jan 21 15:26:23 crc kubenswrapper[4773]: E0121 15:26:23.061539 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 21 15:26:23 crc kubenswrapper[4773]: E0121 15:26:23.062001 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvhn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ldb72_openshift-marketplace(378f5d0d-bcef-44be-b03a-b29b3ea33329): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:23 crc kubenswrapper[4773]: E0121 15:26:23.063535 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ldb72" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329"
Jan 21 15:26:25 crc kubenswrapper[4773]: I0121 15:26:25.205773 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:26:25 crc kubenswrapper[4773]: I0121 15:26:25.206325 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.641485 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ldb72" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.717157 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.717297 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r7cmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z6mw7_openshift-marketplace(0cfe48f4-51ac-4a11-9dd3-b6087995d9ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.718457 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z6mw7" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.737300 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.737474 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ld2fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fzdjk_openshift-marketplace(9289cbf3-df59-4b6d-890f-d213b42bd96b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.738620 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fzdjk" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.789907 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.790069 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsflj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vzncg_openshift-marketplace(3d6badc3-8b6a-4308-84b9-a6a1d6460878): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.791268 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vzncg" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.796734 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.797584 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xr2sg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qr2xs_openshift-marketplace(f28ee43e-c39a-4033-a36b-01a987f6c85e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.799472 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qr2xs" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.800743 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.800882 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78hmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wvqn8_openshift-marketplace(46dfa17b-ab20-4d55-933d-2ee7869977c5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 15:26:26 crc kubenswrapper[4773]: E0121 15:26:26.802051 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wvqn8" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5"
Jan 21 15:26:26 crc kubenswrapper[4773]: I0121 15:26:26.924299 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 21 15:26:26 crc kubenswrapper[4773]: W0121 15:26:26.971861 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8314c771_0440_4c78_a512_6a7febec659b.slice/crio-d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61 WatchSource:0}: Error finding container d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61: Status 404 returned error can't find the container with id d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61
Jan 21 15:26:27 crc kubenswrapper[4773]: W0121 15:26:27.198210 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-3f3dac61b70af0ae4354e20d662ea9b63be3da7719f1bc13e0c7c0bfe3eb1995 WatchSource:0}: Error finding container 3f3dac61b70af0ae4354e20d662ea9b63be3da7719f1bc13e0c7c0bfe3eb1995: Status 404 returned error can't find the container with id 3f3dac61b70af0ae4354e20d662ea9b63be3da7719f1bc13e0c7c0bfe3eb1995
Jan 21 15:26:27 crc kubenswrapper[4773]: W0121 15:26:27.207378 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-318f50545b8d4f2dd213d89e0382f352f2a22ddd905e5962c9484a2448f8529a WatchSource:0}: Error finding container 318f50545b8d4f2dd213d89e0382f352f2a22ddd905e5962c9484a2448f8529a: Status 404 returned error can't find the container with id 318f50545b8d4f2dd213d89e0382f352f2a22ddd905e5962c9484a2448f8529a
Jan 21 15:26:27 crc kubenswrapper[4773]: I0121 15:26:27.208925 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8n66g" event={"ID":"1a01fed4-2691-453e-b74f-c000d5125b53","Type":"ContainerStarted","Data":"d0bc4b2a5fc895d4e9eda55185d0f83da5f19cca2231dc63412aa1fad51fb640"}
Jan 21 15:26:27 crc kubenswrapper[4773]: I0121 15:26:27.211081 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8314c771-0440-4c78-a512-6a7febec659b","Type":"ContainerStarted","Data":"d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61"}
Jan 21 15:26:27 crc kubenswrapper[4773]: I0121 15:26:27.212636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3f3dac61b70af0ae4354e20d662ea9b63be3da7719f1bc13e0c7c0bfe3eb1995"}
Jan 21 15:26:27 crc kubenswrapper[4773]: E0121 15:26:27.216341 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fzdjk" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b"
Jan 21 15:26:27 crc kubenswrapper[4773]: E0121 15:26:27.216593 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qr2xs" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e"
Jan 21 15:26:27 crc kubenswrapper[4773]: E0121 15:26:27.216634 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-z6mw7" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad"
Jan 21 15:26:27 crc kubenswrapper[4773]: E0121 15:26:27.216716 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wvqn8" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5"
Jan 21 15:26:27 crc kubenswrapper[4773]: E0121 15:26:27.216749 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vzncg" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878"
Jan 21 15:26:27 crc kubenswrapper[4773]: I0121 15:26:27.222871 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8n66g" podStartSLOduration=128.222852037 podStartE2EDuration="2m8.222852037s" podCreationTimestamp="2026-01-21 15:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:27.222493418 +0000 UTC m=+152.146983050" watchObservedRunningTime="2026-01-21 15:26:27.222852037 +0000 UTC m=+152.147341659"
Jan 21 15:26:27 crc kubenswrapper[4773]: I0121 15:26:27.361433 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.220339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6c55701a9ce2cb7b11348cf0d71dbb4a7024aba85a271969ec4fbd7b7247f10a"}
Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.224050 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d7f5e8c900603414297a3756300f3cb82af5e560f777ee573b1d3a8e3f457f5e"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.224176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"318f50545b8d4f2dd213d89e0382f352f2a22ddd905e5962c9484a2448f8529a"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.224641 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.226604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"847c02626d54513cbdc155897e5da0c41faa7f8260db86ab2a1052f77767bffd"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.226723 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ae00020a1dde7786d4af7d2fa5eaa15b3b058ed0f5472198e5e3fcb5e8f61ab7"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.228589 4773 generic.go:334] "Generic (PLEG): container finished" podID="8314c771-0440-4c78-a512-6a7febec659b" containerID="aeb23b685d2caddb64402dc9df52aea5f516524f7abbebb705ad7b148b9c9688" exitCode=0 Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.228674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"8314c771-0440-4c78-a512-6a7febec659b","Type":"ContainerDied","Data":"aeb23b685d2caddb64402dc9df52aea5f516524f7abbebb705ad7b148b9c9688"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.231219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71445799-2ffe-4318-8753-3b8801b2db52","Type":"ContainerStarted","Data":"9a8d5673163c05e980969f528f38a786df17c3d299911a36259241f0f13b6ae6"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.231272 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71445799-2ffe-4318-8753-3b8801b2db52","Type":"ContainerStarted","Data":"3130123e1bd8de39c8bc74dd4af37c2934627684642d71993f496eaa42f0e764"} Jan 21 15:26:28 crc kubenswrapper[4773]: I0121 15:26:28.291756 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.291733095 podStartE2EDuration="9.291733095s" podCreationTimestamp="2026-01-21 15:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:26:28.289864517 +0000 UTC m=+153.214354159" watchObservedRunningTime="2026-01-21 15:26:28.291733095 +0000 UTC m=+153.216222717" Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.502320 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.621195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access\") pod \"8314c771-0440-4c78-a512-6a7febec659b\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.621404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir\") pod \"8314c771-0440-4c78-a512-6a7febec659b\" (UID: \"8314c771-0440-4c78-a512-6a7febec659b\") " Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.621546 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8314c771-0440-4c78-a512-6a7febec659b" (UID: "8314c771-0440-4c78-a512-6a7febec659b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.621912 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8314c771-0440-4c78-a512-6a7febec659b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.627317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8314c771-0440-4c78-a512-6a7febec659b" (UID: "8314c771-0440-4c78-a512-6a7febec659b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:29 crc kubenswrapper[4773]: I0121 15:26:29.722993 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8314c771-0440-4c78-a512-6a7febec659b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:30 crc kubenswrapper[4773]: I0121 15:26:30.244605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8314c771-0440-4c78-a512-6a7febec659b","Type":"ContainerDied","Data":"d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61"} Jan 21 15:26:30 crc kubenswrapper[4773]: I0121 15:26:30.244966 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d30d19c975d4a9623cd92ff4c8ab79a6df96888c6ece0f71446b7da702a8cf61" Jan 21 15:26:30 crc kubenswrapper[4773]: I0121 15:26:30.244635 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 15:26:38 crc kubenswrapper[4773]: I0121 15:26:38.295620 4773 generic.go:334] "Generic (PLEG): container finished" podID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerID="d7ffd3c66f9649d9c2db58403915f6a4f301fc789761f755083317eac2e6497f" exitCode=0 Jan 21 15:26:38 crc kubenswrapper[4773]: I0121 15:26:38.296216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerDied","Data":"d7ffd3c66f9649d9c2db58403915f6a4f301fc789761f755083317eac2e6497f"} Jan 21 15:26:39 crc kubenswrapper[4773]: I0121 15:26:39.303135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerStarted","Data":"7c4df6c1bb5e6d8eb40f4535b3304c6e48f13e4410f1bd8606e68765f85f8b0c"} Jan 21 15:26:39 crc 
kubenswrapper[4773]: I0121 15:26:39.305132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerStarted","Data":"0160d7cbf13a3062677142508fc3e2cd62b441f9cbfb449c9d8bba99083560b2"} Jan 21 15:26:39 crc kubenswrapper[4773]: I0121 15:26:39.343149 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t6ld8" podStartSLOduration=3.144620767 podStartE2EDuration="1m10.343129824s" podCreationTimestamp="2026-01-21 15:25:29 +0000 UTC" firstStartedPulling="2026-01-21 15:25:31.509905956 +0000 UTC m=+96.434395578" lastFinishedPulling="2026-01-21 15:26:38.708415013 +0000 UTC m=+163.632904635" observedRunningTime="2026-01-21 15:26:39.341281856 +0000 UTC m=+164.265771488" watchObservedRunningTime="2026-01-21 15:26:39.343129824 +0000 UTC m=+164.267619446" Jan 21 15:26:39 crc kubenswrapper[4773]: I0121 15:26:39.689613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:39 crc kubenswrapper[4773]: I0121 15:26:39.689724 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:40 crc kubenswrapper[4773]: I0121 15:26:40.311850 4773 generic.go:334] "Generic (PLEG): container finished" podID="d36b150f-af27-41a9-b699-db2207d44d58" containerID="7c4df6c1bb5e6d8eb40f4535b3304c6e48f13e4410f1bd8606e68765f85f8b0c" exitCode=0 Jan 21 15:26:40 crc kubenswrapper[4773]: I0121 15:26:40.311932 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerDied","Data":"7c4df6c1bb5e6d8eb40f4535b3304c6e48f13e4410f1bd8606e68765f85f8b0c"} Jan 21 15:26:40 crc kubenswrapper[4773]: I0121 15:26:40.742281 4773 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-t6ld8" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="registry-server" probeResult="failure" output=< Jan 21 15:26:40 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 15:26:40 crc kubenswrapper[4773]: > Jan 21 15:26:50 crc kubenswrapper[4773]: I0121 15:26:50.012482 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:50 crc kubenswrapper[4773]: I0121 15:26:50.223638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:50 crc kubenswrapper[4773]: I0121 15:26:50.265975 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:26:51 crc kubenswrapper[4773]: I0121 15:26:51.367905 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6ld8" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="registry-server" containerID="cri-o://0160d7cbf13a3062677142508fc3e2cd62b441f9cbfb449c9d8bba99083560b2" gracePeriod=2 Jan 21 15:26:52 crc kubenswrapper[4773]: I0121 15:26:52.377766 4773 generic.go:334] "Generic (PLEG): container finished" podID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerID="0160d7cbf13a3062677142508fc3e2cd62b441f9cbfb449c9d8bba99083560b2" exitCode=0 Jan 21 15:26:52 crc kubenswrapper[4773]: I0121 15:26:52.377833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerDied","Data":"0160d7cbf13a3062677142508fc3e2cd62b441f9cbfb449c9d8bba99083560b2"} Jan 21 15:26:55 crc kubenswrapper[4773]: I0121 15:26:55.206063 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:26:55 crc kubenswrapper[4773]: I0121 15:26:55.206432 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:26:56 crc kubenswrapper[4773]: I0121 15:26:56.926081 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.102125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8hg\" (UniqueName: \"kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg\") pod \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.102267 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities\") pod \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.103220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities" (OuterVolumeSpecName: "utilities") pod "dbec81d4-f8ff-45fa-b7f8-b29709e477bb" (UID: "dbec81d4-f8ff-45fa-b7f8-b29709e477bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.103431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content\") pod \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\" (UID: \"dbec81d4-f8ff-45fa-b7f8-b29709e477bb\") " Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.104300 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.108276 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg" (OuterVolumeSpecName: "kube-api-access-nh8hg") pod "dbec81d4-f8ff-45fa-b7f8-b29709e477bb" (UID: "dbec81d4-f8ff-45fa-b7f8-b29709e477bb"). InnerVolumeSpecName "kube-api-access-nh8hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.150553 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbec81d4-f8ff-45fa-b7f8-b29709e477bb" (UID: "dbec81d4-f8ff-45fa-b7f8-b29709e477bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.205258 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.205298 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8hg\" (UniqueName: \"kubernetes.io/projected/dbec81d4-f8ff-45fa-b7f8-b29709e477bb-kube-api-access-nh8hg\") on node \"crc\" DevicePath \"\"" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.406057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerStarted","Data":"c2969a28cec03da3b079363ccda7ded0b3d5fef3e3d3e3ba8aa2c766f1867399"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.408594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerStarted","Data":"dcaac8f82a51952581bd3447fde7d0a516825e2a2c10803ac3a86db0330d1e87"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.421229 4773 generic.go:334] "Generic (PLEG): container finished" podID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerID="188048fece5cbdcd990847c3b1411b6cf76829f3f1919b2fe64a12b74395489f" exitCode=0 Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.421379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerDied","Data":"188048fece5cbdcd990847c3b1411b6cf76829f3f1919b2fe64a12b74395489f"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.440010 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerStarted","Data":"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.449820 4773 generic.go:334] "Generic (PLEG): container finished" podID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerID="dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab" exitCode=0 Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.449900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerDied","Data":"dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.454108 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6ld8" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.454860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6ld8" event={"ID":"dbec81d4-f8ff-45fa-b7f8-b29709e477bb","Type":"ContainerDied","Data":"8bebf2c7d597e7fc57e68d0ea813a13094364c93ed1f82872a127e56da6016d4"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.454952 4773 scope.go:117] "RemoveContainer" containerID="0160d7cbf13a3062677142508fc3e2cd62b441f9cbfb449c9d8bba99083560b2" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.459114 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerStarted","Data":"82161a34ced2fe08152014a9cdaeb4ac87a3c027765d8cd917ec48d0b2186703"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.463321 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" 
event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerStarted","Data":"4dc7d8dca292844cd2a2e01858a573bd4e61d74b78be9adf1c7420e72e4947ee"} Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.507981 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x4rrg" podStartSLOduration=4.284040223 podStartE2EDuration="1m29.507962704s" podCreationTimestamp="2026-01-21 15:25:28 +0000 UTC" firstStartedPulling="2026-01-21 15:25:31.515110601 +0000 UTC m=+96.439600223" lastFinishedPulling="2026-01-21 15:26:56.739033082 +0000 UTC m=+181.663522704" observedRunningTime="2026-01-21 15:26:57.48908524 +0000 UTC m=+182.413574862" watchObservedRunningTime="2026-01-21 15:26:57.507962704 +0000 UTC m=+182.432452326" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.601644 4773 scope.go:117] "RemoveContainer" containerID="d7ffd3c66f9649d9c2db58403915f6a4f301fc789761f755083317eac2e6497f" Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.604100 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.609037 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6ld8"] Jan 21 15:26:57 crc kubenswrapper[4773]: I0121 15:26:57.625352 4773 scope.go:117] "RemoveContainer" containerID="d536bb88e9cc0c29635c99980a9a90544f09b3ac8da4b899b464264d56a9ab79" Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.472914 4773 generic.go:334] "Generic (PLEG): container finished" podID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerID="4dc7d8dca292844cd2a2e01858a573bd4e61d74b78be9adf1c7420e72e4947ee" exitCode=0 Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.472998 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" 
event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerDied","Data":"4dc7d8dca292844cd2a2e01858a573bd4e61d74b78be9adf1c7420e72e4947ee"} Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.474609 4773 generic.go:334] "Generic (PLEG): container finished" podID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerID="c2969a28cec03da3b079363ccda7ded0b3d5fef3e3d3e3ba8aa2c766f1867399" exitCode=0 Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.474667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerDied","Data":"c2969a28cec03da3b079363ccda7ded0b3d5fef3e3d3e3ba8aa2c766f1867399"} Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.482267 4773 generic.go:334] "Generic (PLEG): container finished" podID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerID="d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d" exitCode=0 Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.482344 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerDied","Data":"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d"} Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.497312 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerID="82161a34ced2fe08152014a9cdaeb4ac87a3c027765d8cd917ec48d0b2186703" exitCode=0 Jan 21 15:26:58 crc kubenswrapper[4773]: I0121 15:26:58.497372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerDied","Data":"82161a34ced2fe08152014a9cdaeb4ac87a3c027765d8cd917ec48d0b2186703"} Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.186644 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.187193 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.252867 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.390615 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" path="/var/lib/kubelet/pods/dbec81d4-f8ff-45fa-b7f8-b29709e477bb/volumes" Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.517192 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerStarted","Data":"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266"} Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.520263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerStarted","Data":"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5"} Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.522292 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerStarted","Data":"097525dd10682aeca3047dca7b54c9fb439f55fd91ecfccade5af1c0d5bba740"} Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.525434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerStarted","Data":"f8c006a805e23dee902281718a6f4a5aef1e55c4b4a3c290340eefefe35ebf51"} Jan 21 
15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.528574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerStarted","Data":"04090fcad257a928f2292803325e4773eaae1c9ab1b3f26f947e863da21bbd31"}
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.532333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerStarted","Data":"0c83ca894a5ce65db4656e60e29cd02919a83380cfc445f2497d6d6e118555d8"}
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.551408 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fzdjk"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.551490 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fzdjk"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.564593 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fzdjk" podStartSLOduration=3.177459524 podStartE2EDuration="1m30.564572916s" podCreationTimestamp="2026-01-21 15:25:29 +0000 UTC" firstStartedPulling="2026-01-21 15:25:31.519266829 +0000 UTC m=+96.443756451" lastFinishedPulling="2026-01-21 15:26:58.906380221 +0000 UTC m=+183.830869843" observedRunningTime="2026-01-21 15:26:59.564546735 +0000 UTC m=+184.489036357" watchObservedRunningTime="2026-01-21 15:26:59.564572916 +0000 UTC m=+184.489062538"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.568279 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvqn8" podStartSLOduration=3.253339681 podStartE2EDuration="1m27.568259923s" podCreationTimestamp="2026-01-21 15:25:32 +0000 UTC" firstStartedPulling="2026-01-21 15:25:34.665362662 +0000 UTC m=+99.589852284" lastFinishedPulling="2026-01-21 15:26:58.980282904 +0000 UTC m=+183.904772526" observedRunningTime="2026-01-21 15:26:59.541654037 +0000 UTC m=+184.466143659" watchObservedRunningTime="2026-01-21 15:26:59.568259923 +0000 UTC m=+184.492749545"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.586556 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzncg" podStartSLOduration=4.770129036 podStartE2EDuration="1m31.586535931s" podCreationTimestamp="2026-01-21 15:25:28 +0000 UTC" firstStartedPulling="2026-01-21 15:25:31.512380909 +0000 UTC m=+96.436870541" lastFinishedPulling="2026-01-21 15:26:58.328787814 +0000 UTC m=+183.253277436" observedRunningTime="2026-01-21 15:26:59.584163588 +0000 UTC m=+184.508653230" watchObservedRunningTime="2026-01-21 15:26:59.586535931 +0000 UTC m=+184.511025553"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.608183 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ldb72" podStartSLOduration=3.862805345 podStartE2EDuration="1m28.608167596s" podCreationTimestamp="2026-01-21 15:25:31 +0000 UTC" firstStartedPulling="2026-01-21 15:25:33.642941639 +0000 UTC m=+98.567431261" lastFinishedPulling="2026-01-21 15:26:58.38830389 +0000 UTC m=+183.312793512" observedRunningTime="2026-01-21 15:26:59.600957108 +0000 UTC m=+184.525446740" watchObservedRunningTime="2026-01-21 15:26:59.608167596 +0000 UTC m=+184.532657218"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.623017 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6mw7" podStartSLOduration=3.401690693 podStartE2EDuration="1m28.622994864s" podCreationTimestamp="2026-01-21 15:25:31 +0000 UTC" firstStartedPulling="2026-01-21 15:25:33.658587197 +0000 UTC m=+98.583076819" lastFinishedPulling="2026-01-21 15:26:58.879891368 +0000 UTC m=+183.804380990" observedRunningTime="2026-01-21 15:26:59.622527352 +0000 UTC m=+184.547016984" watchObservedRunningTime="2026-01-21 15:26:59.622994864 +0000 UTC m=+184.547484486"
Jan 21 15:26:59 crc kubenswrapper[4773]: I0121 15:26:59.645534 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qr2xs" podStartSLOduration=3.947381612 podStartE2EDuration="1m28.645514203s" podCreationTimestamp="2026-01-21 15:25:31 +0000 UTC" firstStartedPulling="2026-01-21 15:25:33.599376944 +0000 UTC m=+98.523866556" lastFinishedPulling="2026-01-21 15:26:58.297509525 +0000 UTC m=+183.221999147" observedRunningTime="2026-01-21 15:26:59.643610274 +0000 UTC m=+184.568099916" watchObservedRunningTime="2026-01-21 15:26:59.645514203 +0000 UTC m=+184.570003835"
Jan 21 15:27:00 crc kubenswrapper[4773]: I0121 15:27:00.293772 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lt258"]
Jan 21 15:27:00 crc kubenswrapper[4773]: I0121 15:27:00.596561 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fzdjk" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="registry-server" probeResult="failure" output=<
Jan 21 15:27:00 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 15:27:00 crc kubenswrapper[4773]: >
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.495164 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.580105 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qr2xs"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.580357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qr2xs"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.629074 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qr2xs"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.859763 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ldb72"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.859826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ldb72"
Jan 21 15:27:01 crc kubenswrapper[4773]: I0121 15:27:01.898815 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ldb72"
Jan 21 15:27:02 crc kubenswrapper[4773]: I0121 15:27:02.284476 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6mw7"
Jan 21 15:27:02 crc kubenswrapper[4773]: I0121 15:27:02.284532 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6mw7"
Jan 21 15:27:02 crc kubenswrapper[4773]: I0121 15:27:02.560676 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvqn8"
Jan 21 15:27:02 crc kubenswrapper[4773]: I0121 15:27:02.560755 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvqn8"
Jan 21 15:27:03 crc kubenswrapper[4773]: I0121 15:27:03.318264 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z6mw7" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="registry-server" probeResult="failure" output=<
Jan 21 15:27:03 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 15:27:03 crc kubenswrapper[4773]: >
Jan 21 15:27:03 crc kubenswrapper[4773]: I0121 15:27:03.597101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qr2xs"
Jan 21 15:27:03 crc kubenswrapper[4773]: I0121 15:27:03.625593 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvqn8" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="registry-server" probeResult="failure" output=<
Jan 21 15:27:03 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 15:27:03 crc kubenswrapper[4773]: >
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.714791 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.715194 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="registry-server"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715216 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="registry-server"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.715229 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8314c771-0440-4c78-a512-6a7febec659b" containerName="pruner"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8314c771-0440-4c78-a512-6a7febec659b" containerName="pruner"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.715251 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="extract-content"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715260 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="extract-content"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.715273 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="extract-utilities"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715280 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="extract-utilities"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715398 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbec81d4-f8ff-45fa-b7f8-b29709e477bb" containerName="registry-server"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715419 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8314c771-0440-4c78-a512-6a7febec659b" containerName="pruner"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.715845 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716101 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb" gracePeriod=15
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716222 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085" gracePeriod=15
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716234 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19" gracePeriod=15
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716294 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b" gracePeriod=15
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716294 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138" gracePeriod=15
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716556 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.716207 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.717827 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.717847 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.717893 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.717904 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.717919 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.717927 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.717935 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.717942 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.718085 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718097 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.718109 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718115 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: E0121 15:27:05.718125 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718132 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718286 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718360 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718371 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718382 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718392 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.718434 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.721797 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.761842 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.815765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.815824 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.815856 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.815896 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.815956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.816011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.816034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.816050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917267 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917613 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917626 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:05 crc kubenswrapper[4773]: I0121 15:27:05.917839 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.059159 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:27:06 crc kubenswrapper[4773]: W0121 15:27:06.077880 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6f24d9bb0bb71d1df5135f6f4afd134beeffcc6fac6fbcaf97e731b8465da47f WatchSource:0}: Error finding container 6f24d9bb0bb71d1df5135f6f4afd134beeffcc6fac6fbcaf97e731b8465da47f: Status 404 returned error can't find the container with id 6f24d9bb0bb71d1df5135f6f4afd134beeffcc6fac6fbcaf97e731b8465da47f
Jan 21 15:27:06 crc kubenswrapper[4773]: E0121 15:27:06.081707 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.99:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cc884bceefe43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 15:27:06.080779843 +0000 UTC m=+191.005269465,LastTimestamp:2026-01-21 15:27:06.080779843 +0000 UTC m=+191.005269465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.572017 4773 generic.go:334] "Generic (PLEG): container finished" podID="71445799-2ffe-4318-8753-3b8801b2db52" containerID="9a8d5673163c05e980969f528f38a786df17c3d299911a36259241f0f13b6ae6" exitCode=0
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.572128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71445799-2ffe-4318-8753-3b8801b2db52","Type":"ContainerDied","Data":"9a8d5673163c05e980969f528f38a786df17c3d299911a36259241f0f13b6ae6"}
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.573784 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.574392 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.575100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700"}
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.575206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6f24d9bb0bb71d1df5135f6f4afd134beeffcc6fac6fbcaf97e731b8465da47f"}
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.578993 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.580488 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.581328 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138" exitCode=0
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.581388 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b" exitCode=0
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.581404 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19" exitCode=0
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.581420 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085" exitCode=2
Jan 21 15:27:06 crc kubenswrapper[4773]: I0121 15:27:06.581447 4773 scope.go:117] "RemoveContainer" containerID="37351b7b046b89f53090e65e44b310c0b1d4a78141dfbf038e13404405b53837"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.591861 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.594110 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.594497 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.878564 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.879722 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:07 crc kubenswrapper[4773]: I0121 15:27:07.879983 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.044150 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir\") pod \"71445799-2ffe-4318-8753-3b8801b2db52\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") "
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.044224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock\") pod \"71445799-2ffe-4318-8753-3b8801b2db52\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") "
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.044328 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access\") pod \"71445799-2ffe-4318-8753-3b8801b2db52\" (UID: \"71445799-2ffe-4318-8753-3b8801b2db52\") "
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.045222 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71445799-2ffe-4318-8753-3b8801b2db52" (UID: "71445799-2ffe-4318-8753-3b8801b2db52"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.045295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock" (OuterVolumeSpecName: "var-lock") pod "71445799-2ffe-4318-8753-3b8801b2db52" (UID: "71445799-2ffe-4318-8753-3b8801b2db52"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.050645 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71445799-2ffe-4318-8753-3b8801b2db52" (UID: "71445799-2ffe-4318-8753-3b8801b2db52"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.145778 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71445799-2ffe-4318-8753-3b8801b2db52-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.146111 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.146120 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71445799-2ffe-4318-8753-3b8801b2db52-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.600558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71445799-2ffe-4318-8753-3b8801b2db52","Type":"ContainerDied","Data":"3130123e1bd8de39c8bc74dd4af37c2934627684642d71993f496eaa42f0e764"}
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.600604 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.600614 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3130123e1bd8de39c8bc74dd4af37c2934627684642d71993f496eaa42f0e764"
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.613533 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.616173 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb" exitCode=0
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.616715 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:08 crc kubenswrapper[4773]: I0121 15:27:08.617106 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.162549 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused"
Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.162964 4773 controller.go:195] "Failed to update lease" err="Put
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.163278 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.163623 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.163902 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.163937 4773 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.164293 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="200ms" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.233147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.233543 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.233816 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.234081 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.365078 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.365123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.365246 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="400ms" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.401864 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.403025 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.404512 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.404806 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.405028 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.405897 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.410349 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.411062 4773 status_manager.go:851] "Failed 
to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.411517 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.411874 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.412218 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.412529 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564156 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564254 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564275 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564351 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.564492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.565637 4773 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.565866 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.565883 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.594620 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.595320 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.596025 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.596360 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.596652 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.596873 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.597065 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.623761 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.624538 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.624603 4773 scope.go:117] "RemoveContainer" containerID="de09c362256d2aa9a7f6bde44f35a492b52f7a3bfb06bd8919b03ecde6661138" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.638894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.639752 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.640158 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.640284 4773 scope.go:117] "RemoveContainer" containerID="a7f50cd6169073a50bf9990b49bef9026278904be5fbd3cd247a0f9fff97e75b" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.640384 4773 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.640559 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.640802 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.641044 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.641298 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.641582 4773 
status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.642001 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.642257 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.642554 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.642882 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.652488 4773 scope.go:117] 
"RemoveContainer" containerID="82207948fdcbf4f1db1e788199a770bf74d223587b3e575b3a2cdf8c8d26fb19" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.665381 4773 scope.go:117] "RemoveContainer" containerID="a9f322ec2451a9e5be35afcf00905747bbfa757f45e3c09f2cb2a95d08a65085" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.666925 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.667489 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.667897 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.668160 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.668434 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.668643 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.668913 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.679896 4773 scope.go:117] "RemoveContainer" containerID="af515a5cbc4e986d5135f3087170d6f06f71bf20ba9b4eac5018235dc012a3cb" Jan 21 15:27:09 crc kubenswrapper[4773]: I0121 15:27:09.694366 4773 scope.go:117] "RemoveContainer" containerID="ddcae9f4f599585e4ade81f655526a4afedf1b2187914d0f29a5794fe4e99bfc" Jan 21 15:27:09 crc kubenswrapper[4773]: E0121 15:27:09.766873 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="800ms" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.375153 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.376540 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.376881 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.377193 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 
15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.377517 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.377544 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:27:10 crc kubenswrapper[4773]: E0121 15:27:10.568030 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="1.6s" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.390063 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 21 15:27:11 crc kubenswrapper[4773]: E0121 15:27:11.854847 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.99:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cc884bceefe43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 
15:27:06.080779843 +0000 UTC m=+191.005269465,LastTimestamp:2026-01-21 15:27:06.080779843 +0000 UTC m=+191.005269465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.892571 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.893248 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.893631 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.893958 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.894226 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 
38.129.56.99:6443: connect: connection refused" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.894522 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:11 crc kubenswrapper[4773]: I0121 15:27:11.894770 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: E0121 15:27:12.169852 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="3.2s" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.323302 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.324139 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.324589 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.325291 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.325607 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.326021 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.326380 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.326669 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.362093 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.363600 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.364482 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.365213 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.365767 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.366106 
4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.366452 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.366782 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.603561 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.604416 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.605034 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.605520 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.606015 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.606317 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.606639 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.606986 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.607274 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.639533 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.640198 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.640653 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.641000 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 
15:27:12.641365 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.641710 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.642091 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.642522 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:12 crc kubenswrapper[4773]: I0121 15:27:12.643025 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: E0121 15:27:15.370965 4773 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="6.4s" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.386602 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.387259 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.387861 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.388938 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.389368 4773 status_manager.go:851] "Failed to get status for pod" 
podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.389635 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.390149 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:15 crc kubenswrapper[4773]: I0121 15:27:15.390928 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:16 crc kubenswrapper[4773]: E0121 15:27:16.465003 4773 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.99:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" volumeName="registry-storage" Jan 
21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.383211 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.384500 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.385257 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.385515 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.385747 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.386012 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.386386 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.386978 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.387305 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.397837 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.397881 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:18 crc kubenswrapper[4773]: E0121 15:27:18.398412 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.399160 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:18 crc kubenswrapper[4773]: W0121 15:27:18.426055 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b61a9601f5c4f35e99bed2a3d639be6a1a784a3d7f39bf4aa15115a010dded79 WatchSource:0}: Error finding container b61a9601f5c4f35e99bed2a3d639be6a1a784a3d7f39bf4aa15115a010dded79: Status 404 returned error can't find the container with id b61a9601f5c4f35e99bed2a3d639be6a1a784a3d7f39bf4aa15115a010dded79 Jan 21 15:27:18 crc kubenswrapper[4773]: I0121 15:27:18.677044 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b61a9601f5c4f35e99bed2a3d639be6a1a784a3d7f39bf4aa15115a010dded79"} Jan 21 15:27:19 crc kubenswrapper[4773]: I0121 15:27:19.993009 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 21 15:27:19 crc kubenswrapper[4773]: I0121 15:27:19.993099 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: 
connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.745539 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T15:27:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.746065 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.746486 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.746737 4773 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.747130 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:20 crc kubenswrapper[4773]: E0121 15:27:20.747175 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 15:27:21 crc kubenswrapper[4773]: E0121 15:27:21.772403 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.99:6443: connect: connection refused" interval="7s" Jan 21 15:27:21 crc kubenswrapper[4773]: E0121 15:27:21.856832 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.99:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cc884bceefe43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 15:27:06.080779843 +0000 UTC m=+191.005269465,LastTimestamp:2026-01-21 
15:27:06.080779843 +0000 UTC m=+191.005269465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.712499 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.712850 4773 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a" exitCode=1 Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.712963 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a"} Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.713749 4773 scope.go:117] "RemoveContainer" containerID="e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.713859 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.714356 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 
15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.714420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aec398faaf756b574d13a4246078b127b1c8fd2ca695b7a4bcd4d3798486a2df"} Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.714867 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.715487 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.716180 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.716487 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.717385 4773 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.717716 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:23 crc kubenswrapper[4773]: I0121 15:27:23.718008 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.334852 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.721671 4773 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="aec398faaf756b574d13a4246078b127b1c8fd2ca695b7a4bcd4d3798486a2df" exitCode=0 Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.721754 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"aec398faaf756b574d13a4246078b127b1c8fd2ca695b7a4bcd4d3798486a2df"} Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.722033 4773 kubelet.go:1909] "Trying to 
delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.722054 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:24 crc kubenswrapper[4773]: E0121 15:27:24.722497 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.722499 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.722802 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.723859 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.724717 4773 status_manager.go:851] "Failed to get status for pod" 
podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.725031 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.725504 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.725649 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.725736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d1eb5666e49486c3f4eb94fc5433702d0d7a2ca2bd72e231c235b7e2964ebfa"} Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.725942 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.726203 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.726481 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.726923 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.727176 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.727454 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.727769 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.728068 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.728392 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.728864 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.729291 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:24 crc kubenswrapper[4773]: I0121 15:27:24.729557 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.206178 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.206767 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.206900 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.207538 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.207777 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b" gracePeriod=600 Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.328598 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" podUID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" containerName="oauth-openshift" containerID="cri-o://4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153" gracePeriod=15 Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.392974 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.393409 4773 status_manager.go:851] "Failed to get status for pod" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" pod="openshift-marketplace/certified-operators-vzncg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-vzncg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.393850 4773 status_manager.go:851] "Failed to get status for pod" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" pod="openshift-marketplace/redhat-marketplace-ldb72" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-ldb72\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 
15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.394173 4773 status_manager.go:851] "Failed to get status for pod" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" pod="openshift-marketplace/redhat-operators-z6mw7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-z6mw7\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.394544 4773 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.394813 4773 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.395081 4773 status_manager.go:851] "Failed to get status for pod" podUID="71445799-2ffe-4318-8753-3b8801b2db52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.395433 4773 status_manager.go:851] "Failed to get status for pod" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" pod="openshift-marketplace/community-operators-fzdjk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-fzdjk\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc 
kubenswrapper[4773]: I0121 15:27:25.396149 4773 status_manager.go:851] "Failed to get status for pod" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" pod="openshift-marketplace/redhat-operators-wvqn8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wvqn8\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.396626 4773 status_manager.go:851] "Failed to get status for pod" podUID="d36b150f-af27-41a9-b699-db2207d44d58" pod="openshift-marketplace/community-operators-x4rrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-x4rrg\": dial tcp 38.129.56.99:6443: connect: connection refused" Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.737637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c0c17a3ba23c37a30c55a5349115064c124b2f3156f770e75b4bcbd3f1993075"} Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.740753 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b" exitCode=0 Jan 21 15:27:25 crc kubenswrapper[4773]: I0121 15:27:25.740845 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.222866 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.385215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.385275 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.385315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.385347 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.385374 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjpdz\" (UniqueName: \"kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc 
kubenswrapper[4773]: I0121 15:27:26.385388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386774 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386818 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386829 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod 
"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386860 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386900 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386926 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386948 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.386998 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.387034 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.387063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs\") pod \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\" (UID: \"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b\") " Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.387347 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.387403 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.387417 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.390026 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.390558 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.394962 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.398120 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.398515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.401388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.401553 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.402124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz" (OuterVolumeSpecName: "kube-api-access-zjpdz") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "kube-api-access-zjpdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.402868 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.402505 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.408388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" (UID: "3823c3bf-ed3b-4ad7-a537-99c12d14bc4b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.488880 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.488937 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjpdz\" (UniqueName: \"kubernetes.io/projected/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-kube-api-access-zjpdz\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.488953 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.488975 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.488991 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489005 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489020 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489040 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489157 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489178 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.489193 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.751312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" event={"ID":"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b","Type":"ContainerDied","Data":"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.751325 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.751400 4773 scope.go:117] "RemoveContainer" containerID="4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.751240 4773 generic.go:334] "Generic (PLEG): container finished" podID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" containerID="4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153" exitCode=0 Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.751638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lt258" event={"ID":"3823c3bf-ed3b-4ad7-a537-99c12d14bc4b","Type":"ContainerDied","Data":"90c26e239a14c57fac3f660766e4bce2314f68159b916ddb48adf374764aaef6"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.777497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.781994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02c503480610ce13f40a1a9bbe54f306d982412308fed9ced2414a2a1006658e"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.782484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d5a9d84193426078976f9dd2728384efbe406c7d5fe7fa57716e9900436ca0af"} Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.794139 4773 scope.go:117] "RemoveContainer" 
containerID="4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153" Jan 21 15:27:26 crc kubenswrapper[4773]: E0121 15:27:26.794909 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153\": container with ID starting with 4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153 not found: ID does not exist" containerID="4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153" Jan 21 15:27:26 crc kubenswrapper[4773]: I0121 15:27:26.795215 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153"} err="failed to get container status \"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153\": rpc error: code = NotFound desc = could not find container \"4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153\": container with ID starting with 4ba9eb43800604da97ff9066f60dcdf6e63d2eb7f93e3ed5553f1d4551221153 not found: ID does not exist" Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.588333 4773 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","poddbec81d4-f8ff-45fa-b7f8-b29709e477bb"] err="unable to destroy cgroup paths for cgroup [kubepods burstable poddbec81d4-f8ff-45fa-b7f8-b29709e477bb] : Timed out while waiting for systemd to remove kubepods-burstable-poddbec81d4_f8ff_45fa_b7f8_b29709e477bb.slice" Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.791244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e17bc3292c266026f1d8109699c4e66068037541d2566d8a9307a1979c45cbd"} Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.792311 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.792426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3bfb45ca84d56aaef4c67c9cabe4c12abb0e9323c48a0214e6616dcaaf889d7c"} Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.791513 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:27 crc kubenswrapper[4773]: I0121 15:27:27.792596 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:28 crc kubenswrapper[4773]: I0121 15:27:28.400510 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:28 crc kubenswrapper[4773]: I0121 15:27:28.400882 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:28 crc kubenswrapper[4773]: I0121 15:27:28.405759 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]log ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]etcd ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 21 15:27:28 crc 
kubenswrapper[4773]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/generic-apiserver-start-informers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-filter ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-informers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-controllers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/crd-informer-synced ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-system-namespaces-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/bootstrap-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 21 15:27:28 crc kubenswrapper[4773]: 
[+]poststarthook/start-kube-aggregator-informers ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-registration-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-discovery-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]autoregister-completion ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapi-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 21 15:27:28 crc kubenswrapper[4773]: livez check failed Jan 21 15:27:28 crc kubenswrapper[4773]: I0121 15:27:28.405818 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 15:27:29 crc kubenswrapper[4773]: I0121 15:27:29.206869 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:27:32 crc kubenswrapper[4773]: I0121 15:27:32.803340 4773 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:32 crc kubenswrapper[4773]: I0121 15:27:32.820096 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:32 crc kubenswrapper[4773]: I0121 15:27:32.820129 4773 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:33 crc kubenswrapper[4773]: I0121 15:27:33.404685 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:33 crc kubenswrapper[4773]: I0121 15:27:33.411006 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e6f0765c-ebba-4b7e-a574-2fe137ac005f" Jan 21 15:27:33 crc kubenswrapper[4773]: I0121 15:27:33.824480 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:33 crc kubenswrapper[4773]: I0121 15:27:33.824524 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:33 crc kubenswrapper[4773]: I0121 15:27:33.828837 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 15:27:34 crc kubenswrapper[4773]: I0121 15:27:34.335747 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:27:34 crc kubenswrapper[4773]: I0121 15:27:34.340562 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:27:34 crc kubenswrapper[4773]: I0121 15:27:34.829019 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:34 crc kubenswrapper[4773]: I0121 15:27:34.829336 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="9aef2ac3-d17d-45d6-8cba-f0ab6da6b120" Jan 21 15:27:34 crc kubenswrapper[4773]: I0121 15:27:34.832100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 15:27:35 crc kubenswrapper[4773]: I0121 15:27:35.414538 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e6f0765c-ebba-4b7e-a574-2fe137ac005f" Jan 21 15:27:44 crc kubenswrapper[4773]: I0121 15:27:44.696326 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 15:27:45 crc kubenswrapper[4773]: I0121 15:27:45.504598 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 15:27:45 crc kubenswrapper[4773]: I0121 15:27:45.618183 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:27:45 crc kubenswrapper[4773]: I0121 15:27:45.726586 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 15:27:45 crc kubenswrapper[4773]: I0121 15:27:45.896870 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.078908 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.109531 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.115563 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 
15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.364908 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.403035 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.419062 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.524486 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.538998 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.548134 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.670405 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.681262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.735380 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.794355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.855766 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 
Jan 21 15:27:46 crc kubenswrapper[4773]: I0121 15:27:46.962676 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.122375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.442652 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.446035 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.625616 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.847871 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.881169 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 15:27:47 crc kubenswrapper[4773]: I0121 15:27:47.955320 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.110373 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.158519 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.542967 4773 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.655413 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.662373 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.734950 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.869573 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:27:48 crc kubenswrapper[4773]: I0121 15:27:48.948187 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.012379 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.087015 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.155262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.245194 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.245407 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.536871 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.588548 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.714817 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.728537 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.765981 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 15:27:49 crc kubenswrapper[4773]: I0121 15:27:49.827081 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.035530 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.070508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.384390 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.411183 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 
15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.519086 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.583655 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.612966 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.648016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 15:27:50 crc kubenswrapper[4773]: I0121 15:27:50.672479 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.049591 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.177220 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.278018 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.458681 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.510763 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.580955 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.692929 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.796081 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.924514 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.975305 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:27:51 crc kubenswrapper[4773]: I0121 15:27:51.998921 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.202751 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.436637 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.479665 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.729580 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.770865 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.774395 4773 reflector.go:368] Caches populated for 
*v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.903324 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 15:27:52 crc kubenswrapper[4773]: I0121 15:27:52.959261 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 15:27:53 crc kubenswrapper[4773]: I0121 15:27:53.228875 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 15:27:53 crc kubenswrapper[4773]: I0121 15:27:53.303206 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 15:27:53 crc kubenswrapper[4773]: I0121 15:27:53.398255 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:27:53 crc kubenswrapper[4773]: I0121 15:27:53.613431 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 15:27:53 crc kubenswrapper[4773]: I0121 15:27:53.961253 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.116476 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.147111 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.206338 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.207281 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.340495 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.673956 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.829435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.897368 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.931894 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 15:27:54 crc kubenswrapper[4773]: I0121 15:27:54.945647 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.193303 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.195966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.441475 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 15:27:55 crc 
kubenswrapper[4773]: I0121 15:27:55.475622 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.634449 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.886822 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.891526 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.894777 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=50.894760014 podStartE2EDuration="50.894760014s" podCreationTimestamp="2026-01-21 15:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:32.427100852 +0000 UTC m=+217.351590474" watchObservedRunningTime="2026-01-21 15:27:55.894760014 +0000 UTC m=+240.819249636"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.896544 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-lt258"]
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.896604 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.902096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.917046 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.917024682 podStartE2EDuration="23.917024682s" podCreationTimestamp="2026-01-21 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:27:55.9162906 +0000 UTC m=+240.840780212" watchObservedRunningTime="2026-01-21 15:27:55.917024682 +0000 UTC m=+240.841514304"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.928909 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 21 15:27:55 crc kubenswrapper[4773]: I0121 15:27:55.965226 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.014821 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.132836 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.267399 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.336252 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.356909 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.430625 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.568260 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.878725 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.882558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 21 15:27:56 crc kubenswrapper[4773]: I0121 15:27:56.969016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.082802 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.137718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.197470 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.275848 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.393714 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" path="/var/lib/kubelet/pods/3823c3bf-ed3b-4ad7-a537-99c12d14bc4b/volumes"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.397473 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.500022 4773 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 21 15:27:57 crc kubenswrapper[4773]: I0121 15:27:57.650953 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 21 15:27:58 crc kubenswrapper[4773]: I0121 15:27:58.096422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 21 15:27:58 crc kubenswrapper[4773]: I0121 15:27:58.188284 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 21 15:28:03 crc kubenswrapper[4773]: I0121 15:28:03.457631 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 21 15:28:03 crc kubenswrapper[4773]: I0121 15:28:03.901070 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 21 15:28:05 crc kubenswrapper[4773]: I0121 15:28:05.359418 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 21 15:28:06 crc kubenswrapper[4773]: I0121 15:28:06.460582 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 15:28:06 crc kubenswrapper[4773]: I0121 15:28:06.460835 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700" gracePeriod=5
Jan 21 15:28:07 crc kubenswrapper[4773]: I0121 15:28:07.502895 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 21 15:28:07 crc kubenswrapper[4773]: I0121 15:28:07.888720 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 21 15:28:07 crc kubenswrapper[4773]: I0121 15:28:07.990585 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 21 15:28:08 crc kubenswrapper[4773]: I0121 15:28:08.106414 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 21 15:28:08 crc kubenswrapper[4773]: I0121 15:28:08.450249 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 21 15:28:08 crc kubenswrapper[4773]: I0121 15:28:08.886213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 21 15:28:09 crc kubenswrapper[4773]: I0121 15:28:09.369315 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 21 15:28:09 crc kubenswrapper[4773]: I0121 15:28:09.475373 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 21 15:28:09 crc kubenswrapper[4773]: I0121 15:28:09.685068 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 21 15:28:10 crc kubenswrapper[4773]: I0121 15:28:10.424436 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 21 15:28:10 crc kubenswrapper[4773]: I0121 15:28:10.664176 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.039633 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.039732 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.051468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.052418 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.052475 4773 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700" exitCode=137
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.052524 4773 scope.go:117] "RemoveContainer" containerID="234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.052555 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.068832 4773 scope.go:117] "RemoveContainer" containerID="234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700"
Jan 21 15:28:12 crc kubenswrapper[4773]: E0121 15:28:12.069323 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700\": container with ID starting with 234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700 not found: ID does not exist" containerID="234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.069367 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700"} err="failed to get container status \"234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700\": rpc error: code = NotFound desc = could not find container \"234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700\": container with ID starting with 234f5d76bb91f1586919ef601490fae5a4ff8360211ee1328f0f7ff930afc700 not found: ID does not exist"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104262 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104263 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104324 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104355 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104372 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104395 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104530 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104611 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104625 4773 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104632 4773 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.104639 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.113851 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.205649 4773 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.295865 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.486524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 21 15:28:12 crc kubenswrapper[4773]: I0121 15:28:12.851947 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.047521 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.329129 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.390743 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.391116 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.409868 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.409914 4773 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8cf0c8cf-aeb8-46ed-a7f6-6261593f5752"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.413585 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.413611 4773 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8cf0c8cf-aeb8-46ed-a7f6-6261593f5752"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.546848 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.588388 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 21 15:28:13 crc kubenswrapper[4773]: I0121 15:28:13.807936 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.207412 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.409844 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.429205 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.571377 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.690890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.878211 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 21 15:28:14 crc kubenswrapper[4773]: I0121 15:28:14.917338 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 21 15:28:15 crc kubenswrapper[4773]: I0121 15:28:15.137791 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 21 15:28:15 crc kubenswrapper[4773]: I0121 15:28:15.159334 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 21 15:28:15 crc kubenswrapper[4773]: I0121 15:28:15.514784 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 21 15:28:15 crc kubenswrapper[4773]: I0121 15:28:15.556356 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 21 15:28:15 crc kubenswrapper[4773]: I0121 15:28:15.783277 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 21 15:28:16 crc kubenswrapper[4773]: I0121 15:28:16.112035 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 21 15:28:16 crc kubenswrapper[4773]: I0121 15:28:16.663386 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 21 15:28:17 crc kubenswrapper[4773]: I0121 15:28:17.064449 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 21 15:28:17 crc kubenswrapper[4773]: I0121 15:28:17.085461 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 21 15:28:17 crc kubenswrapper[4773]: I0121 15:28:17.276722 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 21 15:28:17 crc kubenswrapper[4773]: I0121 15:28:17.753569 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 21 15:28:17 crc kubenswrapper[4773]: I0121 15:28:17.762328 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 21 15:28:18 crc kubenswrapper[4773]: I0121 15:28:18.093418 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 21 15:28:18 crc kubenswrapper[4773]: I0121 15:28:18.096042 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 21 15:28:18 crc kubenswrapper[4773]: I0121 15:28:18.181427 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 21 15:28:18 crc kubenswrapper[4773]: I0121 15:28:18.610711 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.038685 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.493106 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"]
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.493359 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" podUID="18d99fca-5145-431e-8bf1-8934b783b569" containerName="controller-manager" containerID="cri-o://b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff" gracePeriod=30
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.500408 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"]
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.500769 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" podUID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" containerName="route-controller-manager" containerID="cri-o://d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613" gracePeriod=30
Jan 21 15:28:19 crc kubenswrapper[4773]: I0121 15:28:19.903414 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.134522 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.212619 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.413226 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.491144 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.793721 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.897438 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 21 15:28:20 crc kubenswrapper[4773]: I0121 15:28:20.960405 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.064373 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.065549 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.070683 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.102187 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.107568 4773 generic.go:334] "Generic (PLEG): container finished" podID="18d99fca-5145-431e-8bf1-8934b783b569" containerID="b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff" exitCode=0
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.107623 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.107636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" event={"ID":"18d99fca-5145-431e-8bf1-8934b783b569","Type":"ContainerDied","Data":"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"}
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.107737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-89m48" event={"ID":"18d99fca-5145-431e-8bf1-8934b783b569","Type":"ContainerDied","Data":"e3a53f51ce3ed98af3f9faa00e5cf1e68e09d0379a5aa02db094840b85e299ff"}
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.107759 4773 scope.go:117] "RemoveContainer" containerID="b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.111499 4773 generic.go:334] "Generic (PLEG): container finished" podID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" containerID="d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613" exitCode=0
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.111531 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.111539 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" event={"ID":"69e4506e-0adb-495a-b22d-ff5ac9e79afa","Type":"ContainerDied","Data":"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613"}
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.111566 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r" event={"ID":"69e4506e-0adb-495a-b22d-ff5ac9e79afa","Type":"ContainerDied","Data":"2970ef81738e0d6320825f7b79a2d6ec41dccc127a126e46defcaf2f5a876e52"}
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115644 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert\") pod \"18d99fca-5145-431e-8bf1-8934b783b569\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115684 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert\") pod \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config\") pod \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca\") pod \"18d99fca-5145-431e-8bf1-8934b783b569\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115767 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config\") pod \"18d99fca-5145-431e-8bf1-8934b783b569\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbgs\" (UniqueName: \"kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs\") pod \"18d99fca-5145-431e-8bf1-8934b783b569\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.115812 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv648\" (UniqueName: \"kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648\") pod \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.116722 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca" (OuterVolumeSpecName: "client-ca") pod "18d99fca-5145-431e-8bf1-8934b783b569" (UID: "18d99fca-5145-431e-8bf1-8934b783b569"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.116744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca\") pod \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\" (UID: \"69e4506e-0adb-495a-b22d-ff5ac9e79afa\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.116781 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles\") pod \"18d99fca-5145-431e-8bf1-8934b783b569\" (UID: \"18d99fca-5145-431e-8bf1-8934b783b569\") "
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.116732 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca" (OuterVolumeSpecName: "client-ca") pod "69e4506e-0adb-495a-b22d-ff5ac9e79afa" (UID: "69e4506e-0adb-495a-b22d-ff5ac9e79afa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.116917 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config" (OuterVolumeSpecName: "config") pod "18d99fca-5145-431e-8bf1-8934b783b569" (UID: "18d99fca-5145-431e-8bf1-8934b783b569"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117194 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config" (OuterVolumeSpecName: "config") pod "69e4506e-0adb-495a-b22d-ff5ac9e79afa" (UID: "69e4506e-0adb-495a-b22d-ff5ac9e79afa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "18d99fca-5145-431e-8bf1-8934b783b569" (UID: "18d99fca-5145-431e-8bf1-8934b783b569"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117279 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117291 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117299 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117306 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e4506e-0adb-495a-b22d-ff5ac9e79afa-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.117314 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d99fca-5145-431e-8bf1-8934b783b569-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.121082 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18d99fca-5145-431e-8bf1-8934b783b569" (UID: "18d99fca-5145-431e-8bf1-8934b783b569"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.124899 4773 scope.go:117] "RemoveContainer" containerID="b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"
Jan 21 15:28:21 crc kubenswrapper[4773]: E0121 15:28:21.125348 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff\": container with ID starting with b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff not found: ID does not exist" containerID="b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.125386 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff"} err="failed to get container status \"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff\": rpc error: code = NotFound desc = could not find container \"b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff\": container with ID starting with b18a5cd56e56175de0b8ea57164ff0a3e149a784cc69bcf73c86a33a233073ff not found: ID does not exist"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.125414 4773 scope.go:117] "RemoveContainer" containerID="d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613"
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.128067 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs" (OuterVolumeSpecName: "kube-api-access-qqbgs") pod "18d99fca-5145-431e-8bf1-8934b783b569" (UID: "18d99fca-5145-431e-8bf1-8934b783b569"). InnerVolumeSpecName "kube-api-access-qqbgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.128635 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648" (OuterVolumeSpecName: "kube-api-access-sv648") pod "69e4506e-0adb-495a-b22d-ff5ac9e79afa" (UID: "69e4506e-0adb-495a-b22d-ff5ac9e79afa"). InnerVolumeSpecName "kube-api-access-sv648". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.128648 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69e4506e-0adb-495a-b22d-ff5ac9e79afa" (UID: "69e4506e-0adb-495a-b22d-ff5ac9e79afa"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.170390 4773 scope.go:117] "RemoveContainer" containerID="d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613" Jan 21 15:28:21 crc kubenswrapper[4773]: E0121 15:28:21.170759 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613\": container with ID starting with d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613 not found: ID does not exist" containerID="d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.170804 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613"} err="failed to get container status \"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613\": rpc error: code = NotFound desc = could not find container \"d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613\": container with ID starting with d862aaae64eb7c964ef96c2dc2706f1c0402b10bdfcb0b16e4c0f650d4d34613 not found: ID does not exist" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.178236 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.218650 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d99fca-5145-431e-8bf1-8934b783b569-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.218690 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4506e-0adb-495a-b22d-ff5ac9e79afa-serving-cert\") on node \"crc\" DevicePath 
\"\"" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.218722 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqbgs\" (UniqueName: \"kubernetes.io/projected/18d99fca-5145-431e-8bf1-8934b783b569-kube-api-access-qqbgs\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.218731 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv648\" (UniqueName: \"kubernetes.io/projected/69e4506e-0adb-495a-b22d-ff5ac9e79afa-kube-api-access-sv648\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.425298 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"] Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.434081 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-89m48"] Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.443293 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"] Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.451745 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6x99r"] Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.484940 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.846098 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.872855 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 15:28:21 crc kubenswrapper[4773]: I0121 15:28:21.891001 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 15:28:22 crc kubenswrapper[4773]: I0121 15:28:22.172079 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 15:28:22 crc kubenswrapper[4773]: I0121 15:28:22.310148 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 15:28:22 crc kubenswrapper[4773]: I0121 15:28:22.649304 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 15:28:22 crc kubenswrapper[4773]: I0121 15:28:22.713965 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 15:28:22 crc kubenswrapper[4773]: I0121 15:28:22.789402 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 15:28:23 crc kubenswrapper[4773]: I0121 15:28:23.032206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 15:28:23 crc kubenswrapper[4773]: I0121 15:28:23.148590 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 15:28:23 crc kubenswrapper[4773]: I0121 15:28:23.342599 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 15:28:23 crc kubenswrapper[4773]: I0121 15:28:23.391021 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d99fca-5145-431e-8bf1-8934b783b569" path="/var/lib/kubelet/pods/18d99fca-5145-431e-8bf1-8934b783b569/volumes" Jan 21 15:28:23 crc kubenswrapper[4773]: I0121 
15:28:23.391533 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" path="/var/lib/kubelet/pods/69e4506e-0adb-495a-b22d-ff5ac9e79afa/volumes" Jan 21 15:28:24 crc kubenswrapper[4773]: I0121 15:28:24.264022 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 15:28:24 crc kubenswrapper[4773]: I0121 15:28:24.295037 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 15:28:24 crc kubenswrapper[4773]: I0121 15:28:24.428322 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 15:28:24 crc kubenswrapper[4773]: I0121 15:28:24.546491 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 15:28:25 crc kubenswrapper[4773]: I0121 15:28:25.193338 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 15:28:25 crc kubenswrapper[4773]: I0121 15:28:25.877527 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 15:28:25 crc kubenswrapper[4773]: I0121 15:28:25.908677 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.002668 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.048584 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.497307 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.528135 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.709160 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.833400 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.840006 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.862809 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 15:28:26 crc kubenswrapper[4773]: I0121 15:28:26.892395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.019581 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.378992 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.603122 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.795139 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.870745 4773 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:27 crc kubenswrapper[4773]: E0121 15:28:27.870989 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871001 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:28:27 crc kubenswrapper[4773]: E0121 15:28:27.871015 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" containerName="oauth-openshift" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871021 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" containerName="oauth-openshift" Jan 21 15:28:27 crc kubenswrapper[4773]: E0121 15:28:27.871032 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" containerName="route-controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871039 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" containerName="route-controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: E0121 15:28:27.871052 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d99fca-5145-431e-8bf1-8934b783b569" containerName="controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871058 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d99fca-5145-431e-8bf1-8934b783b569" containerName="controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: E0121 15:28:27.871066 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71445799-2ffe-4318-8753-3b8801b2db52" containerName="installer" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871072 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="71445799-2ffe-4318-8753-3b8801b2db52" containerName="installer" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871190 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e4506e-0adb-495a-b22d-ff5ac9e79afa" containerName="route-controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871206 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d99fca-5145-431e-8bf1-8934b783b569" containerName="controller-manager" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871212 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3823c3bf-ed3b-4ad7-a537-99c12d14bc4b" containerName="oauth-openshift" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871218 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="71445799-2ffe-4318-8753-3b8801b2db52" containerName="installer" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871225 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.871611 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.873989 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.874023 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.873988 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.874430 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.874468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.874709 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.875718 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-sjf92"] Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.876773 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881019 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881160 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881188 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881247 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881244 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881628 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881659 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881951 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.881958 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 
15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.882026 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.883717 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.884841 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.885666 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-sjf92"] Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.889611 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.890585 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895290 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895301 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895335 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddpfc\" (UniqueName: 
\"kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895379 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: 
\"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895577 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895610 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895667 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 
15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895848 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895901 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-audit-policies\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895930 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0bb379b-3861-47d3-a55f-7851caac997e-audit-dir\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.895998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5lgz\" (UniqueName: \"kubernetes.io/projected/a0bb379b-3861-47d3-a55f-7851caac997e-kube-api-access-r5lgz\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " 
pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996903 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-audit-policies\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996961 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0bb379b-3861-47d3-a55f-7851caac997e-audit-dir\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5lgz\" (UniqueName: 
\"kubernetes.io/projected/a0bb379b-3861-47d3-a55f-7851caac997e-kube-api-access-r5lgz\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.996999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddpfc\" (UniqueName: \"kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 
21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997070 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0bb379b-3861-47d3-a55f-7851caac997e-audit-dir\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997970 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.997998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.998404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.998439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.998468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:27 crc kubenswrapper[4773]: I0121 15:28:27.998497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " 
pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.001162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-audit-policies\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.001237 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.001635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.001997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.002506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.002633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.004664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.004951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.006150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: 
I0121 15:28:28.006391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.006796 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.006867 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.007097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.008954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.013333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0bb379b-3861-47d3-a55f-7851caac997e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.013966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddpfc\" (UniqueName: \"kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc\") pod \"route-controller-manager-f68d7958-92d2j\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.014162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5lgz\" (UniqueName: \"kubernetes.io/projected/a0bb379b-3861-47d3-a55f-7851caac997e-kube-api-access-r5lgz\") pod \"oauth-openshift-6b9699fff8-sjf92\" (UID: \"a0bb379b-3861-47d3-a55f-7851caac997e\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.200198 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.206498 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.390872 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.441414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-sjf92"] Jan 21 15:28:28 crc kubenswrapper[4773]: I0121 15:28:28.749969 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.102932 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.156299 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" event={"ID":"a0bb379b-3861-47d3-a55f-7851caac997e","Type":"ContainerStarted","Data":"0a047654ea152f8d809f65598f8ad6040caa26f1e544e51c4e03ee72cd3492a5"} Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.156355 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.156368 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" event={"ID":"a0bb379b-3861-47d3-a55f-7851caac997e","Type":"ContainerStarted","Data":"dde18940c2bfb77de54ab1c0fd1680a49e63760ffe0a091e9634d845c11cdfef"} Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.157886 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" 
event={"ID":"32656903-ca62-4c85-a677-906ecd1a31bb","Type":"ContainerStarted","Data":"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3"} Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.157917 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" event={"ID":"32656903-ca62-4c85-a677-906ecd1a31bb","Type":"ContainerStarted","Data":"74eb48f9ba6408e3f216146332b7c0c7076b4e5eb3c864c669668268ccfb10d6"} Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.158112 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.162825 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.164215 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.182499 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b9699fff8-sjf92" podStartSLOduration=89.182476872 podStartE2EDuration="1m29.182476872s" podCreationTimestamp="2026-01-21 15:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:29.175988453 +0000 UTC m=+274.100478085" watchObservedRunningTime="2026-01-21 15:28:29.182476872 +0000 UTC m=+274.106966494" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.219758 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.233212 4773 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" podStartSLOduration=10.23318775 podStartE2EDuration="10.23318775s" podCreationTimestamp="2026-01-21 15:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:29.229945864 +0000 UTC m=+274.154435506" watchObservedRunningTime="2026-01-21 15:28:29.23318775 +0000 UTC m=+274.157677372" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.316585 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.392368 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.393111 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.395462 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.395677 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.395740 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.396095 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.396299 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.396511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.404035 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.406547 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.421537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zttm8\" (UniqueName: \"kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " 
pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.421619 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.421664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.421727 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.421767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.522252 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.522315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.522356 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zttm8\" (UniqueName: \"kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.522380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.522399 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.523483 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.523654 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.523895 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.529271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.539159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zttm8\" (UniqueName: \"kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8\") pod \"controller-manager-7f66dc4f94-k8sh9\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 
15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.585504 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.652927 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.711937 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:29 crc kubenswrapper[4773]: I0121 15:28:29.889280 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:29 crc kubenswrapper[4773]: W0121 15:28:29.896408 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6158429c_14e0_4faf_b1c5_cbbf5b4ac70d.slice/crio-197f356cd5ef72e1784b633d7c7f189494bd34a799c62671e8b07cea554f2a72 WatchSource:0}: Error finding container 197f356cd5ef72e1784b633d7c7f189494bd34a799c62671e8b07cea554f2a72: Status 404 returned error can't find the container with id 197f356cd5ef72e1784b633d7c7f189494bd34a799c62671e8b07cea554f2a72 Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.167604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" event={"ID":"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d","Type":"ContainerStarted","Data":"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9"} Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.167974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" event={"ID":"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d","Type":"ContainerStarted","Data":"197f356cd5ef72e1784b633d7c7f189494bd34a799c62671e8b07cea554f2a72"} 
Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.199234 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" podStartSLOduration=11.199215147 podStartE2EDuration="11.199215147s" podCreationTimestamp="2026-01-21 15:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:30.19588366 +0000 UTC m=+275.120373282" watchObservedRunningTime="2026-01-21 15:28:30.199215147 +0000 UTC m=+275.123704769" Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.379213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.587484 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 15:28:30 crc kubenswrapper[4773]: I0121 15:28:30.633373 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 15:28:31 crc kubenswrapper[4773]: I0121 15:28:31.020218 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:31 crc kubenswrapper[4773]: I0121 15:28:31.039247 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:31 crc kubenswrapper[4773]: I0121 15:28:31.172989 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:31 crc kubenswrapper[4773]: I0121 15:28:31.177546 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:31 crc 
kubenswrapper[4773]: I0121 15:28:31.819801 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.178281 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" podUID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" containerName="controller-manager" containerID="cri-o://83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9" gracePeriod=30 Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.178236 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" podUID="32656903-ca62-4c85-a677-906ecd1a31bb" containerName="route-controller-manager" containerID="cri-o://518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3" gracePeriod=30 Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.480727 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.607727 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.635934 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:28:32 crc kubenswrapper[4773]: E0121 15:28:32.636165 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32656903-ca62-4c85-a677-906ecd1a31bb" containerName="route-controller-manager" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.636187 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="32656903-ca62-4c85-a677-906ecd1a31bb" containerName="route-controller-manager" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.636335 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="32656903-ca62-4c85-a677-906ecd1a31bb" containerName="route-controller-manager" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.636882 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.652985 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.653367 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca\") pod \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699221 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert\") pod \"32656903-ca62-4c85-a677-906ecd1a31bb\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699256 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config\") pod \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles\") pod \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699334 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config\") pod \"32656903-ca62-4c85-a677-906ecd1a31bb\" (UID: 
\"32656903-ca62-4c85-a677-906ecd1a31bb\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca\") pod \"32656903-ca62-4c85-a677-906ecd1a31bb\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699438 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddpfc\" (UniqueName: \"kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc\") pod \"32656903-ca62-4c85-a677-906ecd1a31bb\" (UID: \"32656903-ca62-4c85-a677-906ecd1a31bb\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699505 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zttm8\" (UniqueName: \"kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8\") pod \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699530 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert\") pod \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\" (UID: \"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d\") " Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699807 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9p5\" (UniqueName: \"kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.699979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca" (OuterVolumeSpecName: "client-ca") pod "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" (UID: "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.700442 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" (UID: "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.700553 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config" (OuterVolumeSpecName: "config") pod "32656903-ca62-4c85-a677-906ecd1a31bb" (UID: "32656903-ca62-4c85-a677-906ecd1a31bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.700576 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config" (OuterVolumeSpecName: "config") pod "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" (UID: "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.700885 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "32656903-ca62-4c85-a677-906ecd1a31bb" (UID: "32656903-ca62-4c85-a677-906ecd1a31bb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.705088 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32656903-ca62-4c85-a677-906ecd1a31bb" (UID: "32656903-ca62-4c85-a677-906ecd1a31bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.705348 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" (UID: "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.705578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc" (OuterVolumeSpecName: "kube-api-access-ddpfc") pod "32656903-ca62-4c85-a677-906ecd1a31bb" (UID: "32656903-ca62-4c85-a677-906ecd1a31bb"). InnerVolumeSpecName "kube-api-access-ddpfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.705976 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8" (OuterVolumeSpecName: "kube-api-access-zttm8") pod "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" (UID: "6158429c-14e0-4faf-b1c5-cbbf5b4ac70d"). InnerVolumeSpecName "kube-api-access-zttm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.757378 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800709 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800745 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm9p5\" (UniqueName: \"kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc 
kubenswrapper[4773]: I0121 15:28:32.800805 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zttm8\" (UniqueName: \"kubernetes.io/projected/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-kube-api-access-zttm8\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800816 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800824 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800832 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32656903-ca62-4c85-a677-906ecd1a31bb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800841 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800849 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800857 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800865 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/32656903-ca62-4c85-a677-906ecd1a31bb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.800874 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddpfc\" (UniqueName: \"kubernetes.io/projected/32656903-ca62-4c85-a677-906ecd1a31bb-kube-api-access-ddpfc\") on node \"crc\" DevicePath \"\"" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.801811 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.802030 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.804679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.823756 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9p5\" (UniqueName: \"kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5\") pod \"route-controller-manager-74d76d5567-njpwd\" (UID: 
\"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:32 crc kubenswrapper[4773]: I0121 15:28:32.963512 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.184370 4773 generic.go:334] "Generic (PLEG): container finished" podID="32656903-ca62-4c85-a677-906ecd1a31bb" containerID="518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3" exitCode=0 Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.184438 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.184451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" event={"ID":"32656903-ca62-4c85-a677-906ecd1a31bb","Type":"ContainerDied","Data":"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3"} Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.184479 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j" event={"ID":"32656903-ca62-4c85-a677-906ecd1a31bb","Type":"ContainerDied","Data":"74eb48f9ba6408e3f216146332b7c0c7076b4e5eb3c864c669668268ccfb10d6"} Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.184515 4773 scope.go:117] "RemoveContainer" containerID="518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.186173 4773 generic.go:334] "Generic (PLEG): container finished" podID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" containerID="83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9" exitCode=0 Jan 21 15:28:33 crc 
kubenswrapper[4773]: I0121 15:28:33.186192 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" event={"ID":"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d","Type":"ContainerDied","Data":"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9"} Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.186207 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" event={"ID":"6158429c-14e0-4faf-b1c5-cbbf5b4ac70d","Type":"ContainerDied","Data":"197f356cd5ef72e1784b633d7c7f189494bd34a799c62671e8b07cea554f2a72"} Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.186253 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.202081 4773 scope.go:117] "RemoveContainer" containerID="518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3" Jan 21 15:28:33 crc kubenswrapper[4773]: E0121 15:28:33.202933 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3\": container with ID starting with 518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3 not found: ID does not exist" containerID="518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.203021 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3"} err="failed to get container status \"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3\": rpc error: code = NotFound desc = could not find container \"518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3\": container with ID 
starting with 518c1ecc539d946e536a21610ff76ce2b1cd090c9e53a9624c7c5904cdc8a9e3 not found: ID does not exist" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.203191 4773 scope.go:117] "RemoveContainer" containerID="83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.216829 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.222034 4773 scope.go:117] "RemoveContainer" containerID="83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9" Jan 21 15:28:33 crc kubenswrapper[4773]: E0121 15:28:33.222413 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9\": container with ID starting with 83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9 not found: ID does not exist" containerID="83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.222449 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9"} err="failed to get container status \"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9\": rpc error: code = NotFound desc = could not find container \"83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9\": container with ID starting with 83b78a5c910dd10b4f0e7d16620f233383304baaa8837ebc047d309839bfecf9 not found: ID does not exist" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.223514 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f68d7958-92d2j"] Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.237579 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.242051 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f66dc4f94-k8sh9"] Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.370353 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.412848 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32656903-ca62-4c85-a677-906ecd1a31bb" path="/var/lib/kubelet/pods/32656903-ca62-4c85-a677-906ecd1a31bb/volumes" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.418004 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" path="/var/lib/kubelet/pods/6158429c-14e0-4faf-b1c5-cbbf5b4ac70d/volumes" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.435295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 15:28:33 crc kubenswrapper[4773]: I0121 15:28:33.845467 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.183422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.195139 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" event={"ID":"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e","Type":"ContainerStarted","Data":"a0940c936ca2396a43816a80485dcbfc47e0c1d948026168ee11fff3a845962e"} Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.195197 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" event={"ID":"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e","Type":"ContainerStarted","Data":"03daf6e78a62315d7249403b6db8e09ce9e569830c04100b490b5f7807e74ed9"} Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.196953 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.201746 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.203405 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.216579 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" podStartSLOduration=3.216558841 podStartE2EDuration="3.216558841s" podCreationTimestamp="2026-01-21 15:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:34.212277247 +0000 UTC m=+279.136766899" watchObservedRunningTime="2026-01-21 15:28:34.216558841 +0000 UTC m=+279.141048463" Jan 21 15:28:34 crc kubenswrapper[4773]: I0121 15:28:34.964229 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.271051 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.399733 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:28:35 crc kubenswrapper[4773]: E0121 15:28:35.400233 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" containerName="controller-manager" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.400361 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" containerName="controller-manager" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.400600 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6158429c-14e0-4faf-b1c5-cbbf5b4ac70d" containerName="controller-manager" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.401141 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.407683 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.407713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.407756 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.408383 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.408589 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.408614 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.419663 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.425584 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.441514 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.441576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.441621 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5hd\" (UniqueName: \"kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.441647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.441666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.470157 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.542621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.542714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.542749 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5hd\" (UniqueName: \"kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd\") pod 
\"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.542793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.542820 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.543922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.544123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.544505 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.553939 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.565550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5hd\" (UniqueName: \"kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd\") pod \"controller-manager-5d4b59f8f9-hrn4w\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.719263 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:35 crc kubenswrapper[4773]: I0121 15:28:35.898786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:28:35 crc kubenswrapper[4773]: W0121 15:28:35.907954 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c41a63_e62f_4171_bf6f_5c19f862f009.slice/crio-f69650543922fce96435de2d8ddc5e9dc7aa2cc1a9bf53e125b330df049af612 WatchSource:0}: Error finding container f69650543922fce96435de2d8ddc5e9dc7aa2cc1a9bf53e125b330df049af612: Status 404 returned error can't find the container with id f69650543922fce96435de2d8ddc5e9dc7aa2cc1a9bf53e125b330df049af612 Jan 21 15:28:36 crc kubenswrapper[4773]: I0121 15:28:36.213423 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" event={"ID":"78c41a63-e62f-4171-bf6f-5c19f862f009","Type":"ContainerStarted","Data":"7c51192d75a3766de60ef56c6fa37200e5c62154ef0a027b99d4feda7b34d25b"} Jan 21 15:28:36 crc kubenswrapper[4773]: I0121 15:28:36.213861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" event={"ID":"78c41a63-e62f-4171-bf6f-5c19f862f009","Type":"ContainerStarted","Data":"f69650543922fce96435de2d8ddc5e9dc7aa2cc1a9bf53e125b330df049af612"} Jan 21 15:28:36 crc kubenswrapper[4773]: I0121 15:28:36.323762 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" podStartSLOduration=5.323740898 podStartE2EDuration="5.323740898s" podCreationTimestamp="2026-01-21 15:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:28:36.321524913 +0000 UTC 
m=+281.246014535" watchObservedRunningTime="2026-01-21 15:28:36.323740898 +0000 UTC m=+281.248230520" Jan 21 15:28:36 crc kubenswrapper[4773]: I0121 15:28:36.590162 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 15:28:37 crc kubenswrapper[4773]: I0121 15:28:37.218358 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:37 crc kubenswrapper[4773]: I0121 15:28:37.223640 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:28:37 crc kubenswrapper[4773]: I0121 15:28:37.643387 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 15:28:38 crc kubenswrapper[4773]: I0121 15:28:38.364057 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 15:28:38 crc kubenswrapper[4773]: I0121 15:28:38.451340 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 15:28:38 crc kubenswrapper[4773]: I0121 15:28:38.572503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 15:28:39 crc kubenswrapper[4773]: I0121 15:28:39.221217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 15:28:55 crc kubenswrapper[4773]: I0121 15:28:55.229071 4773 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 21 15:29:16 crc kubenswrapper[4773]: I0121 15:29:16.976836 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-6zffl"] Jan 21 15:29:16 crc kubenswrapper[4773]: I0121 15:29:16.980083 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:16 crc kubenswrapper[4773]: I0121 15:29:16.997416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6zffl"] Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099003 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-certificates\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f68dadac-0d91-4530-a8db-7fe64427a84e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-trusted-ca\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099208 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-tls\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-bound-sa-token\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099473 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f68dadac-0d91-4530-a8db-7fe64427a84e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.099591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5p8\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-kube-api-access-nq5p8\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.126310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f68dadac-0d91-4530-a8db-7fe64427a84e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-trusted-ca\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200574 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-tls\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-bound-sa-token\") pod 
\"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f68dadac-0d91-4530-a8db-7fe64427a84e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5p8\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-kube-api-access-nq5p8\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.200747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-certificates\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.202020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f68dadac-0d91-4530-a8db-7fe64427a84e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.202030 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-trusted-ca\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.202195 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-certificates\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.208415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f68dadac-0d91-4530-a8db-7fe64427a84e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.208588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-registry-tls\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.217938 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5p8\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-kube-api-access-nq5p8\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.217965 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f68dadac-0d91-4530-a8db-7fe64427a84e-bound-sa-token\") pod \"image-registry-66df7c8f76-6zffl\" (UID: \"f68dadac-0d91-4530-a8db-7fe64427a84e\") " pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.297836 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:17 crc kubenswrapper[4773]: I0121 15:29:17.693091 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6zffl"] Jan 21 15:29:18 crc kubenswrapper[4773]: I0121 15:29:18.452237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" event={"ID":"f68dadac-0d91-4530-a8db-7fe64427a84e","Type":"ContainerStarted","Data":"c52715c25e7887486a9f0be656ff3adde71312294cef01429a1fb4b66a01e42a"} Jan 21 15:29:19 crc kubenswrapper[4773]: I0121 15:29:19.459493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" event={"ID":"f68dadac-0d91-4530-a8db-7fe64427a84e","Type":"ContainerStarted","Data":"d80960333ef12155d4421dc2d791abce5e9fbe721f6fbcf51d2d8b47d0d7f0cf"} Jan 21 15:29:19 crc kubenswrapper[4773]: I0121 15:29:19.460498 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:19 crc kubenswrapper[4773]: I0121 15:29:19.468990 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:29:19 crc kubenswrapper[4773]: I0121 15:29:19.469478 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" podUID="78c41a63-e62f-4171-bf6f-5c19f862f009" 
containerName="controller-manager" containerID="cri-o://7c51192d75a3766de60ef56c6fa37200e5c62154ef0a027b99d4feda7b34d25b" gracePeriod=30 Jan 21 15:29:20 crc kubenswrapper[4773]: I0121 15:29:20.465517 4773 generic.go:334] "Generic (PLEG): container finished" podID="78c41a63-e62f-4171-bf6f-5c19f862f009" containerID="7c51192d75a3766de60ef56c6fa37200e5c62154ef0a027b99d4feda7b34d25b" exitCode=0 Jan 21 15:29:20 crc kubenswrapper[4773]: I0121 15:29:20.465605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" event={"ID":"78c41a63-e62f-4171-bf6f-5c19f862f009","Type":"ContainerDied","Data":"7c51192d75a3766de60ef56c6fa37200e5c62154ef0a027b99d4feda7b34d25b"} Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.666379 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.688548 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" podStartSLOduration=5.6885266340000005 podStartE2EDuration="5.688526634s" podCreationTimestamp="2026-01-21 15:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:19.48304516 +0000 UTC m=+324.407534782" watchObservedRunningTime="2026-01-21 15:29:21.688526634 +0000 UTC m=+326.613016256" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.694108 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8595656658-qlhgt"] Jan 21 15:29:21 crc kubenswrapper[4773]: E0121 15:29:21.694325 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c41a63-e62f-4171-bf6f-5c19f862f009" containerName="controller-manager" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.694341 
4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c41a63-e62f-4171-bf6f-5c19f862f009" containerName="controller-manager" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.694450 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c41a63-e62f-4171-bf6f-5c19f862f009" containerName="controller-manager" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.694841 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.717718 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8595656658-qlhgt"] Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.763257 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles\") pod \"78c41a63-e62f-4171-bf6f-5c19f862f009\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.763377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca\") pod \"78c41a63-e62f-4171-bf6f-5c19f862f009\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.763408 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c5hd\" (UniqueName: \"kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd\") pod \"78c41a63-e62f-4171-bf6f-5c19f862f009\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.763456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert\") pod \"78c41a63-e62f-4171-bf6f-5c19f862f009\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.763490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config\") pod \"78c41a63-e62f-4171-bf6f-5c19f862f009\" (UID: \"78c41a63-e62f-4171-bf6f-5c19f862f009\") " Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.764210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca" (OuterVolumeSpecName: "client-ca") pod "78c41a63-e62f-4171-bf6f-5c19f862f009" (UID: "78c41a63-e62f-4171-bf6f-5c19f862f009"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.764221 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "78c41a63-e62f-4171-bf6f-5c19f862f009" (UID: "78c41a63-e62f-4171-bf6f-5c19f862f009"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.764314 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config" (OuterVolumeSpecName: "config") pod "78c41a63-e62f-4171-bf6f-5c19f862f009" (UID: "78c41a63-e62f-4171-bf6f-5c19f862f009"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.768478 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78c41a63-e62f-4171-bf6f-5c19f862f009" (UID: "78c41a63-e62f-4171-bf6f-5c19f862f009"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.768649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd" (OuterVolumeSpecName: "kube-api-access-6c5hd") pod "78c41a63-e62f-4171-bf6f-5c19f862f009" (UID: "78c41a63-e62f-4171-bf6f-5c19f862f009"). InnerVolumeSpecName "kube-api-access-6c5hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.865467 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-proxy-ca-bundles\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.865872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-config\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866095 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-client-ca\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866199 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzzt\" (UniqueName: \"kubernetes.io/projected/3aee91c1-d447-4c46-ae5e-94460c8857ec-kube-api-access-9xzzt\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866228 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aee91c1-d447-4c46-ae5e-94460c8857ec-serving-cert\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866395 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866487 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c5hd\" (UniqueName: \"kubernetes.io/projected/78c41a63-e62f-4171-bf6f-5c19f862f009-kube-api-access-6c5hd\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866561 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c41a63-e62f-4171-bf6f-5c19f862f009-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 
15:29:21.866634 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.866744 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c41a63-e62f-4171-bf6f-5c19f862f009-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.967817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-proxy-ca-bundles\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.968235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-config\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.968366 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-client-ca\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.968514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzzt\" (UniqueName: \"kubernetes.io/projected/3aee91c1-d447-4c46-ae5e-94460c8857ec-kube-api-access-9xzzt\") pod 
\"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.968633 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aee91c1-d447-4c46-ae5e-94460c8857ec-serving-cert\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.969076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-proxy-ca-bundles\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.969101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-client-ca\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.969476 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aee91c1-d447-4c46-ae5e-94460c8857ec-config\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.972427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3aee91c1-d447-4c46-ae5e-94460c8857ec-serving-cert\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:21 crc kubenswrapper[4773]: I0121 15:29:21.986437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzzt\" (UniqueName: \"kubernetes.io/projected/3aee91c1-d447-4c46-ae5e-94460c8857ec-kube-api-access-9xzzt\") pod \"controller-manager-8595656658-qlhgt\" (UID: \"3aee91c1-d447-4c46-ae5e-94460c8857ec\") " pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.025676 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.204744 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8595656658-qlhgt"] Jan 21 15:29:22 crc kubenswrapper[4773]: W0121 15:29:22.210551 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aee91c1_d447_4c46_ae5e_94460c8857ec.slice/crio-8186e437edd2991ac6385c293dd9ff942c8586d41a098a6da839cb151275de60 WatchSource:0}: Error finding container 8186e437edd2991ac6385c293dd9ff942c8586d41a098a6da839cb151275de60: Status 404 returned error can't find the container with id 8186e437edd2991ac6385c293dd9ff942c8586d41a098a6da839cb151275de60 Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.481170 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" event={"ID":"3aee91c1-d447-4c46-ae5e-94460c8857ec","Type":"ContainerStarted","Data":"003ae1bfff898017effa64edd295b67c7bb8ffdaa40c87152c96c9947c10879b"} Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 
15:29:22.481223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" event={"ID":"3aee91c1-d447-4c46-ae5e-94460c8857ec","Type":"ContainerStarted","Data":"8186e437edd2991ac6385c293dd9ff942c8586d41a098a6da839cb151275de60"} Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.482494 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.484516 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" event={"ID":"78c41a63-e62f-4171-bf6f-5c19f862f009","Type":"ContainerDied","Data":"f69650543922fce96435de2d8ddc5e9dc7aa2cc1a9bf53e125b330df049af612"} Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.484560 4773 scope.go:117] "RemoveContainer" containerID="7c51192d75a3766de60ef56c6fa37200e5c62154ef0a027b99d4feda7b34d25b" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.484584 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.490638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.523873 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8595656658-qlhgt" podStartSLOduration=3.523855968 podStartE2EDuration="3.523855968s" podCreationTimestamp="2026-01-21 15:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:22.503945151 +0000 UTC m=+327.428434773" watchObservedRunningTime="2026-01-21 15:29:22.523855968 +0000 UTC m=+327.448345600" Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.539484 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.542432 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d4b59f8f9-hrn4w"] Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.955365 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:29:22 crc kubenswrapper[4773]: I0121 15:29:22.955975 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fzdjk" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="registry-server" containerID="cri-o://04090fcad257a928f2292803325e4773eaae1c9ab1b3f26f947e863da21bbd31" gracePeriod=2 Jan 21 15:29:23 crc kubenswrapper[4773]: I0121 15:29:23.394589 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c41a63-e62f-4171-bf6f-5c19f862f009" 
path="/var/lib/kubelet/pods/78c41a63-e62f-4171-bf6f-5c19f862f009/volumes" Jan 21 15:29:25 crc kubenswrapper[4773]: I0121 15:29:25.165202 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:29:25 crc kubenswrapper[4773]: I0121 15:29:25.165560 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ldb72" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="registry-server" containerID="cri-o://73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5" gracePeriod=2 Jan 21 15:29:25 crc kubenswrapper[4773]: I0121 15:29:25.356014 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:29:25 crc kubenswrapper[4773]: I0121 15:29:25.356569 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvqn8" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="registry-server" containerID="cri-o://e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266" gracePeriod=2 Jan 21 15:29:27 crc kubenswrapper[4773]: I0121 15:29:27.540376 4773 generic.go:334] "Generic (PLEG): container finished" podID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerID="04090fcad257a928f2292803325e4773eaae1c9ab1b3f26f947e863da21bbd31" exitCode=0 Jan 21 15:29:27 crc kubenswrapper[4773]: I0121 15:29:27.540462 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerDied","Data":"04090fcad257a928f2292803325e4773eaae1c9ab1b3f26f947e863da21bbd31"} Jan 21 15:29:27 crc kubenswrapper[4773]: I0121 15:29:27.868074 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.017116 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ldb72_378f5d0d-bcef-44be-b03a-b29b3ea33329/registry-server/0.log" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.017872 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.052639 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities\") pod \"9289cbf3-df59-4b6d-890f-d213b42bd96b\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.052757 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2fc\" (UniqueName: \"kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc\") pod \"9289cbf3-df59-4b6d-890f-d213b42bd96b\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.052789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content\") pod \"9289cbf3-df59-4b6d-890f-d213b42bd96b\" (UID: \"9289cbf3-df59-4b6d-890f-d213b42bd96b\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.053335 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities" (OuterVolumeSpecName: "utilities") pod "9289cbf3-df59-4b6d-890f-d213b42bd96b" (UID: "9289cbf3-df59-4b6d-890f-d213b42bd96b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.056017 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.058118 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc" (OuterVolumeSpecName: "kube-api-access-ld2fc") pod "9289cbf3-df59-4b6d-890f-d213b42bd96b" (UID: "9289cbf3-df59-4b6d-890f-d213b42bd96b"). InnerVolumeSpecName "kube-api-access-ld2fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.112401 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9289cbf3-df59-4b6d-890f-d213b42bd96b" (UID: "9289cbf3-df59-4b6d-890f-d213b42bd96b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.156575 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content\") pod \"378f5d0d-bcef-44be-b03a-b29b3ea33329\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.156979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvhn6\" (UniqueName: \"kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6\") pod \"378f5d0d-bcef-44be-b03a-b29b3ea33329\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.157146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities\") pod \"378f5d0d-bcef-44be-b03a-b29b3ea33329\" (UID: \"378f5d0d-bcef-44be-b03a-b29b3ea33329\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.158241 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities" (OuterVolumeSpecName: "utilities") pod "378f5d0d-bcef-44be-b03a-b29b3ea33329" (UID: "378f5d0d-bcef-44be-b03a-b29b3ea33329"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.158608 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.158630 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld2fc\" (UniqueName: \"kubernetes.io/projected/9289cbf3-df59-4b6d-890f-d213b42bd96b-kube-api-access-ld2fc\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.158641 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9289cbf3-df59-4b6d-890f-d213b42bd96b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.159883 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6" (OuterVolumeSpecName: "kube-api-access-zvhn6") pod "378f5d0d-bcef-44be-b03a-b29b3ea33329" (UID: "378f5d0d-bcef-44be-b03a-b29b3ea33329"). InnerVolumeSpecName "kube-api-access-zvhn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.181131 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "378f5d0d-bcef-44be-b03a-b29b3ea33329" (UID: "378f5d0d-bcef-44be-b03a-b29b3ea33329"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.259878 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/378f5d0d-bcef-44be-b03a-b29b3ea33329-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.259922 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvhn6\" (UniqueName: \"kubernetes.io/projected/378f5d0d-bcef-44be-b03a-b29b3ea33329-kube-api-access-zvhn6\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.423291 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wvqn8_46dfa17b-ab20-4d55-933d-2ee7869977c5/registry-server/0.log" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.424162 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.549181 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzdjk" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.549308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzdjk" event={"ID":"9289cbf3-df59-4b6d-890f-d213b42bd96b","Type":"ContainerDied","Data":"1f4dd542a9801bb4fa4a63fd8c9f28d461953eadfe7cc8718e8d5582b5faf890"} Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.549373 4773 scope.go:117] "RemoveContainer" containerID="04090fcad257a928f2292803325e4773eaae1c9ab1b3f26f947e863da21bbd31" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.552831 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wvqn8_46dfa17b-ab20-4d55-933d-2ee7869977c5/registry-server/0.log" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.553652 4773 generic.go:334] "Generic (PLEG): container finished" podID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerID="e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266" exitCode=137 Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.553729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerDied","Data":"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266"} Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.553761 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvqn8" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.553767 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvqn8" event={"ID":"46dfa17b-ab20-4d55-933d-2ee7869977c5","Type":"ContainerDied","Data":"2f31ee880ccbdbfae41beea24f14893eded9ed9787878db1b1d3b029a927879f"} Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.556906 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ldb72_378f5d0d-bcef-44be-b03a-b29b3ea33329/registry-server/0.log" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.557668 4773 generic.go:334] "Generic (PLEG): container finished" podID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerID="73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5" exitCode=137 Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.557722 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerDied","Data":"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5"} Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.557742 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ldb72" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.557759 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ldb72" event={"ID":"378f5d0d-bcef-44be-b03a-b29b3ea33329","Type":"ContainerDied","Data":"a682892a8e40069cdb7386b6d507533a9f48687b210e4af8c3fb8163ade6850c"} Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.563381 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities\") pod \"46dfa17b-ab20-4d55-933d-2ee7869977c5\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.563582 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hmf\" (UniqueName: \"kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf\") pod \"46dfa17b-ab20-4d55-933d-2ee7869977c5\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.563679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content\") pod \"46dfa17b-ab20-4d55-933d-2ee7869977c5\" (UID: \"46dfa17b-ab20-4d55-933d-2ee7869977c5\") " Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.567267 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities" (OuterVolumeSpecName: "utilities") pod "46dfa17b-ab20-4d55-933d-2ee7869977c5" (UID: "46dfa17b-ab20-4d55-933d-2ee7869977c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.568585 4773 scope.go:117] "RemoveContainer" containerID="c2969a28cec03da3b079363ccda7ded0b3d5fef3e3d3e3ba8aa2c766f1867399" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.571177 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf" (OuterVolumeSpecName: "kube-api-access-78hmf") pod "46dfa17b-ab20-4d55-933d-2ee7869977c5" (UID: "46dfa17b-ab20-4d55-933d-2ee7869977c5"). InnerVolumeSpecName "kube-api-access-78hmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.585836 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.593036 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fzdjk"] Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.618801 4773 scope.go:117] "RemoveContainer" containerID="0fe1baca2236af71ff20ae6c95d0231e4a9fe2967cae31c8c4f01a1aa79c8665" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.627433 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.633322 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ldb72"] Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.638148 4773 scope.go:117] "RemoveContainer" containerID="e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.650752 4773 scope.go:117] "RemoveContainer" containerID="d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.665187 4773 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.665447 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hmf\" (UniqueName: \"kubernetes.io/projected/46dfa17b-ab20-4d55-933d-2ee7869977c5-kube-api-access-78hmf\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.667833 4773 scope.go:117] "RemoveContainer" containerID="2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.680995 4773 scope.go:117] "RemoveContainer" containerID="e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.681540 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266\": container with ID starting with e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266 not found: ID does not exist" containerID="e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.681573 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266"} err="failed to get container status \"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266\": rpc error: code = NotFound desc = could not find container \"e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266\": container with ID starting with e23fd25a79a0dac885de857d23a1e2fb3e1f460bc4a46463e1c86f90157ff266 not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.681599 4773 scope.go:117] "RemoveContainer" 
containerID="d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.681954 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d\": container with ID starting with d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d not found: ID does not exist" containerID="d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.681981 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d"} err="failed to get container status \"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d\": rpc error: code = NotFound desc = could not find container \"d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d\": container with ID starting with d4584bfb5cc8e7773e2d3955e51ecf0de51512217745f2ddf9127925cedf4c6d not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.682000 4773 scope.go:117] "RemoveContainer" containerID="2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.682375 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4\": container with ID starting with 2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4 not found: ID does not exist" containerID="2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.682397 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4"} err="failed to get container status \"2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4\": rpc error: code = NotFound desc = could not find container \"2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4\": container with ID starting with 2cd69b88a9c5eeb915c0250c7ed5792c5f09d12f8871f2015a7843dcf23434d4 not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.682412 4773 scope.go:117] "RemoveContainer" containerID="73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.698162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46dfa17b-ab20-4d55-933d-2ee7869977c5" (UID: "46dfa17b-ab20-4d55-933d-2ee7869977c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.698519 4773 scope.go:117] "RemoveContainer" containerID="dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.712413 4773 scope.go:117] "RemoveContainer" containerID="5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.725264 4773 scope.go:117] "RemoveContainer" containerID="73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.726147 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5\": container with ID starting with 73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5 not found: ID does not exist" containerID="73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.726191 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5"} err="failed to get container status \"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5\": rpc error: code = NotFound desc = could not find container \"73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5\": container with ID starting with 73ca423ada7ed9916994b38bc0e9a11680f1a7259492a2ce3978b8431bf168d5 not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.726221 4773 scope.go:117] "RemoveContainer" containerID="dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.726764 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab\": container with ID starting with dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab not found: ID does not exist" containerID="dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.726793 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab"} err="failed to get container status \"dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab\": rpc error: code = NotFound desc = could not find container \"dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab\": container with ID starting with dfa20da59eb5a4cdbeb2d688cbfb842ff60040d706ecb6dfd1ec56cdeca51fab not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.726829 4773 scope.go:117] "RemoveContainer" containerID="5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d" Jan 21 15:29:28 crc kubenswrapper[4773]: E0121 15:29:28.727132 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d\": container with ID starting with 5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d not found: ID does not exist" containerID="5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.727244 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d"} err="failed to get container status \"5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d\": rpc error: code = NotFound desc = could not find container \"5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d\": 
container with ID starting with 5a88ffe167296f03d3bda0db85b1ec5f40dc8a9e941085efc506a662abaefc3d not found: ID does not exist" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.766554 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46dfa17b-ab20-4d55-933d-2ee7869977c5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.881651 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:29:28 crc kubenswrapper[4773]: I0121 15:29:28.885765 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvqn8"] Jan 21 15:29:29 crc kubenswrapper[4773]: I0121 15:29:29.391235 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" path="/var/lib/kubelet/pods/378f5d0d-bcef-44be-b03a-b29b3ea33329/volumes" Jan 21 15:29:29 crc kubenswrapper[4773]: I0121 15:29:29.392434 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" path="/var/lib/kubelet/pods/46dfa17b-ab20-4d55-933d-2ee7869977c5/volumes" Jan 21 15:29:29 crc kubenswrapper[4773]: I0121 15:29:29.393123 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" path="/var/lib/kubelet/pods/9289cbf3-df59-4b6d-890f-d213b42bd96b/volumes" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.287276 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.288354 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzncg" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="registry-server" 
containerID="cri-o://097525dd10682aeca3047dca7b54c9fb439f55fd91ecfccade5af1c0d5bba740" gracePeriod=30 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.298680 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.300596 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x4rrg" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="registry-server" containerID="cri-o://dcaac8f82a51952581bd3447fde7d0a516825e2a2c10803ac3a86db0330d1e87" gracePeriod=30 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.310974 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.311437 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" containerID="cri-o://ef3bd6511f8e7459721d629cd0705d7ca2707cdde18393d9f251bb354eca849d" gracePeriod=30 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.317564 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6zffl" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.323778 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.324135 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qr2xs" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="registry-server" containerID="cri-o://0c83ca894a5ce65db4656e60e29cd02919a83380cfc445f2497d6d6e118555d8" gracePeriod=30 Jan 21 15:29:37 crc 
kubenswrapper[4773]: I0121 15:29:37.328776 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.329042 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6mw7" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="registry-server" containerID="cri-o://f8c006a805e23dee902281718a6f4a5aef1e55c4b4a3c290340eefefe35ebf51" gracePeriod=30 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.332875 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxs87"] Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333156 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333169 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333182 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333190 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333210 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333217 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333231 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333240 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="extract-utilities" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333249 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333258 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333270 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333277 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333288 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333295 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333307 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333315 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="extract-content" Jan 21 15:29:37 crc kubenswrapper[4773]: E0121 15:29:37.333323 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333330 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333449 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dfa17b-ab20-4d55-933d-2ee7869977c5" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333465 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9289cbf3-df59-4b6d-890f-d213b42bd96b" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.333475 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="378f5d0d-bcef-44be-b03a-b29b3ea33329" containerName="registry-server" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.334139 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.337358 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxs87"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.404338 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"] Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.501646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.501840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.501929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjrk\" (UniqueName: \"kubernetes.io/projected/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-kube-api-access-smjrk\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.603912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.604005 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjrk\" (UniqueName: \"kubernetes.io/projected/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-kube-api-access-smjrk\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.604043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.605376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.618496 4773 generic.go:334] "Generic (PLEG): container finished" podID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerID="0c83ca894a5ce65db4656e60e29cd02919a83380cfc445f2497d6d6e118555d8" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.618625 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" 
event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerDied","Data":"0c83ca894a5ce65db4656e60e29cd02919a83380cfc445f2497d6d6e118555d8"} Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.620840 4773 generic.go:334] "Generic (PLEG): container finished" podID="262910dc-030f-4767-833a-507c1a280963" containerID="ef3bd6511f8e7459721d629cd0705d7ca2707cdde18393d9f251bb354eca849d" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.620930 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" event={"ID":"262910dc-030f-4767-833a-507c1a280963","Type":"ContainerDied","Data":"ef3bd6511f8e7459721d629cd0705d7ca2707cdde18393d9f251bb354eca849d"} Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.621503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.624089 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerID="097525dd10682aeca3047dca7b54c9fb439f55fd91ecfccade5af1c0d5bba740" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.624154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerDied","Data":"097525dd10682aeca3047dca7b54c9fb439f55fd91ecfccade5af1c0d5bba740"} Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.624738 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjrk\" (UniqueName: 
\"kubernetes.io/projected/f51f220a-c9d3-4bb3-938a-72ab3ae24ee7-kube-api-access-smjrk\") pod \"marketplace-operator-79b997595-hxs87\" (UID: \"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7\") " pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.627004 4773 generic.go:334] "Generic (PLEG): container finished" podID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerID="f8c006a805e23dee902281718a6f4a5aef1e55c4b4a3c290340eefefe35ebf51" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.627078 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerDied","Data":"f8c006a805e23dee902281718a6f4a5aef1e55c4b4a3c290340eefefe35ebf51"} Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.629508 4773 generic.go:334] "Generic (PLEG): container finished" podID="d36b150f-af27-41a9-b699-db2207d44d58" containerID="dcaac8f82a51952581bd3447fde7d0a516825e2a2c10803ac3a86db0330d1e87" exitCode=0 Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.629541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerDied","Data":"dcaac8f82a51952581bd3447fde7d0a516825e2a2c10803ac3a86db0330d1e87"} Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.662612 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:37 crc kubenswrapper[4773]: I0121 15:29:37.863787 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.008125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities\") pod \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.008171 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content\") pod \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.008219 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsflj\" (UniqueName: \"kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj\") pod \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\" (UID: \"3d6badc3-8b6a-4308-84b9-a6a1d6460878\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.009213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities" (OuterVolumeSpecName: "utilities") pod "3d6badc3-8b6a-4308-84b9-a6a1d6460878" (UID: "3d6badc3-8b6a-4308-84b9-a6a1d6460878"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.013922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj" (OuterVolumeSpecName: "kube-api-access-rsflj") pod "3d6badc3-8b6a-4308-84b9-a6a1d6460878" (UID: "3d6badc3-8b6a-4308-84b9-a6a1d6460878"). InnerVolumeSpecName "kube-api-access-rsflj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.075169 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d6badc3-8b6a-4308-84b9-a6a1d6460878" (UID: "3d6badc3-8b6a-4308-84b9-a6a1d6460878"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.083049 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.098469 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.110412 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.110451 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6badc3-8b6a-4308-84b9-a6a1d6460878-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.110464 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsflj\" (UniqueName: \"kubernetes.io/projected/3d6badc3-8b6a-4308-84b9-a6a1d6460878-kube-api-access-rsflj\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.151881 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.157357 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities\") pod \"f28ee43e-c39a-4033-a36b-01a987f6c85e\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214541 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content\") pod \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214594 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content\") pod \"f28ee43e-c39a-4033-a36b-01a987f6c85e\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214761 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2sg\" (UniqueName: \"kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg\") pod \"f28ee43e-c39a-4033-a36b-01a987f6c85e\" (UID: \"f28ee43e-c39a-4033-a36b-01a987f6c85e\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7cmz\" (UniqueName: \"kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz\") pod 
\"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.214862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities\") pod \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\" (UID: \"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.215373 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities" (OuterVolumeSpecName: "utilities") pod "f28ee43e-c39a-4033-a36b-01a987f6c85e" (UID: "f28ee43e-c39a-4033-a36b-01a987f6c85e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.216271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities" (OuterVolumeSpecName: "utilities") pod "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" (UID: "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.220230 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg" (OuterVolumeSpecName: "kube-api-access-xr2sg") pod "f28ee43e-c39a-4033-a36b-01a987f6c85e" (UID: "f28ee43e-c39a-4033-a36b-01a987f6c85e"). InnerVolumeSpecName "kube-api-access-xr2sg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.220575 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz" (OuterVolumeSpecName: "kube-api-access-r7cmz") pod "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" (UID: "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad"). InnerVolumeSpecName "kube-api-access-r7cmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.244851 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f28ee43e-c39a-4033-a36b-01a987f6c85e" (UID: "f28ee43e-c39a-4033-a36b-01a987f6c85e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316632 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca\") pod \"262910dc-030f-4767-833a-507c1a280963\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316714 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rptdp\" (UniqueName: \"kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp\") pod \"d36b150f-af27-41a9-b699-db2207d44d58\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316792 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities\") pod \"d36b150f-af27-41a9-b699-db2207d44d58\" 
(UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316829 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics\") pod \"262910dc-030f-4767-833a-507c1a280963\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316878 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content\") pod \"d36b150f-af27-41a9-b699-db2207d44d58\" (UID: \"d36b150f-af27-41a9-b699-db2207d44d58\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.316928 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hw9b\" (UniqueName: \"kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b\") pod \"262910dc-030f-4767-833a-507c1a280963\" (UID: \"262910dc-030f-4767-833a-507c1a280963\") " Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.317140 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7cmz\" (UniqueName: \"kubernetes.io/projected/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-kube-api-access-r7cmz\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.317153 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.317161 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 
15:29:38.317172 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28ee43e-c39a-4033-a36b-01a987f6c85e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.317183 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2sg\" (UniqueName: \"kubernetes.io/projected/f28ee43e-c39a-4033-a36b-01a987f6c85e-kube-api-access-xr2sg\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.318649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "262910dc-030f-4767-833a-507c1a280963" (UID: "262910dc-030f-4767-833a-507c1a280963"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.318803 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities" (OuterVolumeSpecName: "utilities") pod "d36b150f-af27-41a9-b699-db2207d44d58" (UID: "d36b150f-af27-41a9-b699-db2207d44d58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.320887 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "262910dc-030f-4767-833a-507c1a280963" (UID: "262910dc-030f-4767-833a-507c1a280963"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.321666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b" (OuterVolumeSpecName: "kube-api-access-6hw9b") pod "262910dc-030f-4767-833a-507c1a280963" (UID: "262910dc-030f-4767-833a-507c1a280963"). InnerVolumeSpecName "kube-api-access-6hw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.322026 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp" (OuterVolumeSpecName: "kube-api-access-rptdp") pod "d36b150f-af27-41a9-b699-db2207d44d58" (UID: "d36b150f-af27-41a9-b699-db2207d44d58"). InnerVolumeSpecName "kube-api-access-rptdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.356087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hxs87"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.359274 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" (UID: "0cfe48f4-51ac-4a11-9dd3-b6087995d9ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.390927 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d36b150f-af27-41a9-b699-db2207d44d58" (UID: "d36b150f-af27-41a9-b699-db2207d44d58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418274 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418310 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/262910dc-030f-4767-833a-507c1a280963-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418322 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418331 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d36b150f-af27-41a9-b699-db2207d44d58-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418339 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hw9b\" (UniqueName: \"kubernetes.io/projected/262910dc-030f-4767-833a-507c1a280963-kube-api-access-6hw9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418347 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/262910dc-030f-4767-833a-507c1a280963-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.418357 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rptdp\" (UniqueName: \"kubernetes.io/projected/d36b150f-af27-41a9-b699-db2207d44d58-kube-api-access-rptdp\") on node \"crc\" DevicePath \"\"" Jan 
21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.637086 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" event={"ID":"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7","Type":"ContainerStarted","Data":"ed9e4804a6487552ff99dec5c880210d76e7a32c2ebdfc022af5ecb28c8e7e8e"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.637155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" event={"ID":"f51f220a-c9d3-4bb3-938a-72ab3ae24ee7","Type":"ContainerStarted","Data":"457953b88d24752b536a2b1e6037bf16f0819c16d8f65432a525ed303c05cda6"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.637766 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.640532 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hxs87 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused" start-of-body= Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.640572 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x4rrg" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.640573 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4rrg" event={"ID":"d36b150f-af27-41a9-b699-db2207d44d58","Type":"ContainerDied","Data":"55c6c00e57fb64e6fc5f4e11cbe6f1b2d5bb5e8e5528a7d4f119a0c105dfe87c"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.640605 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" podUID="f51f220a-c9d3-4bb3-938a-72ab3ae24ee7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.64:8080/healthz\": dial tcp 10.217.0.64:8080: connect: connection refused" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.640641 4773 scope.go:117] "RemoveContainer" containerID="dcaac8f82a51952581bd3447fde7d0a516825e2a2c10803ac3a86db0330d1e87" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.644432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qr2xs" event={"ID":"f28ee43e-c39a-4033-a36b-01a987f6c85e","Type":"ContainerDied","Data":"f193b9dd42b8876645d8072220ea6dae62bbf7e63fca216fbb9a7b760c9a5b2b"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.644530 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qr2xs" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.647817 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" event={"ID":"262910dc-030f-4767-833a-507c1a280963","Type":"ContainerDied","Data":"627103faf0c6509df47db9e80c33cd18a7247e2da096d8ec1466fac844b0ff2f"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.647947 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zpcds" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.659112 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzncg" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.659131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzncg" event={"ID":"3d6badc3-8b6a-4308-84b9-a6a1d6460878","Type":"ContainerDied","Data":"d9e777a5afb1411614059f5afd95f1b74c7451607f755421597315778eb02ed0"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.662921 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" podStartSLOduration=1.662902981 podStartE2EDuration="1.662902981s" podCreationTimestamp="2026-01-21 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:38.656134545 +0000 UTC m=+343.580624177" watchObservedRunningTime="2026-01-21 15:29:38.662902981 +0000 UTC m=+343.587392603" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.664570 4773 scope.go:117] "RemoveContainer" containerID="7c4df6c1bb5e6d8eb40f4535b3304c6e48f13e4410f1bd8606e68765f85f8b0c" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.665550 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mw7" event={"ID":"0cfe48f4-51ac-4a11-9dd3-b6087995d9ad","Type":"ContainerDied","Data":"915245cf347a80f9507681cc30b1a6c6073759a8aeb245e6ca3c688c5688b2b9"} Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.665660 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mw7" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.687498 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.692590 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x4rrg"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.694820 4773 scope.go:117] "RemoveContainer" containerID="1b63ab3dc83648f1114ef5a532e065a6842ad97d041ad8b3f805363e53d2a26b" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.714519 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.717543 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qr2xs"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.729375 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.731218 4773 scope.go:117] "RemoveContainer" containerID="0c83ca894a5ce65db4656e60e29cd02919a83380cfc445f2497d6d6e118555d8" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.732234 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzncg"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.740267 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.747542 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zpcds"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.754839 4773 scope.go:117] "RemoveContainer" 
containerID="188048fece5cbdcd990847c3b1411b6cf76829f3f1919b2fe64a12b74395489f" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.764598 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.767371 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6mw7"] Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.786472 4773 scope.go:117] "RemoveContainer" containerID="c0c83841ea929110342dca06c276be385d406550c40ee586024029c5c04ae923" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.804962 4773 scope.go:117] "RemoveContainer" containerID="ef3bd6511f8e7459721d629cd0705d7ca2707cdde18393d9f251bb354eca849d" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.821725 4773 scope.go:117] "RemoveContainer" containerID="097525dd10682aeca3047dca7b54c9fb439f55fd91ecfccade5af1c0d5bba740" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.838110 4773 scope.go:117] "RemoveContainer" containerID="82161a34ced2fe08152014a9cdaeb4ac87a3c027765d8cd917ec48d0b2186703" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.857768 4773 scope.go:117] "RemoveContainer" containerID="1768459ed151d20ff8d317cb6e7dab6c8df171533708f3859a2b9ce48c6e2a4f" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.873684 4773 scope.go:117] "RemoveContainer" containerID="f8c006a805e23dee902281718a6f4a5aef1e55c4b4a3c290340eefefe35ebf51" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.890714 4773 scope.go:117] "RemoveContainer" containerID="4dc7d8dca292844cd2a2e01858a573bd4e61d74b78be9adf1c7420e72e4947ee" Jan 21 15:29:38 crc kubenswrapper[4773]: I0121 15:29:38.909801 4773 scope.go:117] "RemoveContainer" containerID="d91b2014071a56219bfa4c05b48c0308b984013dd969cc1612279804668b37d1" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.390375 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" path="/var/lib/kubelet/pods/0cfe48f4-51ac-4a11-9dd3-b6087995d9ad/volumes" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.392025 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262910dc-030f-4767-833a-507c1a280963" path="/var/lib/kubelet/pods/262910dc-030f-4767-833a-507c1a280963/volumes" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.392886 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" path="/var/lib/kubelet/pods/3d6badc3-8b6a-4308-84b9-a6a1d6460878/volumes" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.394196 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36b150f-af27-41a9-b699-db2207d44d58" path="/var/lib/kubelet/pods/d36b150f-af27-41a9-b699-db2207d44d58/volumes" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.395127 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" path="/var/lib/kubelet/pods/f28ee43e-c39a-4033-a36b-01a987f6c85e/volumes" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.487704 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.488169 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" podUID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" containerName="route-controller-manager" containerID="cri-o://a0940c936ca2396a43816a80485dcbfc47e0c1d948026168ee11fff3a845962e" gracePeriod=30 Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.696640 4773 generic.go:334] "Generic (PLEG): container finished" podID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" containerID="a0940c936ca2396a43816a80485dcbfc47e0c1d948026168ee11fff3a845962e" exitCode=0 Jan 21 
15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.696759 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" event={"ID":"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e","Type":"ContainerDied","Data":"a0940c936ca2396a43816a80485dcbfc47e0c1d948026168ee11fff3a845962e"} Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.702674 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hxs87" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.768671 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nph5h"] Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774292 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774328 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774336 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774359 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774374 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774381 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774391 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774399 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774410 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774418 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774426 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774433 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774443 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774451 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774459 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774466 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774476 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774483 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774491 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774500 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774511 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774518 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="extract-utilities" Jan 21 15:29:39 crc kubenswrapper[4773]: E0121 15:29:39.774530 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774538 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="extract-content" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774660 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfe48f4-51ac-4a11-9dd3-b6087995d9ad" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774674 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="262910dc-030f-4767-833a-507c1a280963" containerName="marketplace-operator" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774704 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6badc3-8b6a-4308-84b9-a6a1d6460878" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774719 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28ee43e-c39a-4033-a36b-01a987f6c85e" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.774729 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36b150f-af27-41a9-b699-db2207d44d58" containerName="registry-server" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.775586 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.775934 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nph5h"] Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.777658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.938560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-catalog-content\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.938622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-utilities\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:39 crc kubenswrapper[4773]: I0121 15:29:39.938658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25589\" (UniqueName: \"kubernetes.io/projected/520f1326-f825-4ac0-90dd-a02f8dc8756d-kube-api-access-25589\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.016762 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.039938 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-catalog-content\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.040005 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-utilities\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.040047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25589\" (UniqueName: \"kubernetes.io/projected/520f1326-f825-4ac0-90dd-a02f8dc8756d-kube-api-access-25589\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.040829 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-catalog-content\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.040873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f1326-f825-4ac0-90dd-a02f8dc8756d-utilities\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") 
" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.058159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25589\" (UniqueName: \"kubernetes.io/projected/520f1326-f825-4ac0-90dd-a02f8dc8756d-kube-api-access-25589\") pod \"redhat-marketplace-nph5h\" (UID: \"520f1326-f825-4ac0-90dd-a02f8dc8756d\") " pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.098613 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.140675 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm9p5\" (UniqueName: \"kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5\") pod \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.140743 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca\") pod \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.140858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config\") pod \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\" (UID: \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.140919 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert\") pod \"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\" (UID: 
\"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e\") " Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.141602 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" (UID: "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.141847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config" (OuterVolumeSpecName: "config") pod "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" (UID: "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.143536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5" (OuterVolumeSpecName: "kube-api-access-sm9p5") pod "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" (UID: "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e"). InnerVolumeSpecName "kube-api-access-sm9p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.143925 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" (UID: "0a6d0d08-8ef5-49c9-b154-f5a014b6e47e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.242602 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.242642 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.242653 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm9p5\" (UniqueName: \"kubernetes.io/projected/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-kube-api-access-sm9p5\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.242663 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.361480 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bn8fn"] Jan 21 15:29:40 crc kubenswrapper[4773]: E0121 15:29:40.361735 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" containerName="route-controller-manager" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.361746 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" containerName="route-controller-manager" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.361845 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" containerName="route-controller-manager" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.362503 4773 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.365881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.371123 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn8fn"] Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.493949 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nph5h"] Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.546274 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-catalog-content\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.546634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-utilities\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.546683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmk6c\" (UniqueName: \"kubernetes.io/projected/a70f2296-498b-4347-b80d-1d26a02d7d93-kube-api-access-hmk6c\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.648379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-catalog-content\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.648434 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-utilities\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.648467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmk6c\" (UniqueName: \"kubernetes.io/projected/a70f2296-498b-4347-b80d-1d26a02d7d93-kube-api-access-hmk6c\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.649223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-catalog-content\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.649762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70f2296-498b-4347-b80d-1d26a02d7d93-utilities\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.666947 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmk6c\" (UniqueName: 
\"kubernetes.io/projected/a70f2296-498b-4347-b80d-1d26a02d7d93-kube-api-access-hmk6c\") pod \"redhat-operators-bn8fn\" (UID: \"a70f2296-498b-4347-b80d-1d26a02d7d93\") " pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.683521 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.708082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" event={"ID":"0a6d0d08-8ef5-49c9-b154-f5a014b6e47e","Type":"ContainerDied","Data":"03daf6e78a62315d7249403b6db8e09ce9e569830c04100b490b5f7807e74ed9"} Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.708109 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.708133 4773 scope.go:117] "RemoveContainer" containerID="a0940c936ca2396a43816a80485dcbfc47e0c1d948026168ee11fff3a845962e" Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.711322 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f1326-f825-4ac0-90dd-a02f8dc8756d" containerID="a6f1164c6f87a0872880bafbe1af10276f269051ec7a01da18db79f9d4d499a7" exitCode=0 Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.711590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nph5h" event={"ID":"520f1326-f825-4ac0-90dd-a02f8dc8756d","Type":"ContainerDied","Data":"a6f1164c6f87a0872880bafbe1af10276f269051ec7a01da18db79f9d4d499a7"} Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.712012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nph5h" 
event={"ID":"520f1326-f825-4ac0-90dd-a02f8dc8756d","Type":"ContainerStarted","Data":"de29f947ca6db9c03c0815e22f325b27eb59096b4a11fec91ede9c2506ef5e73"} Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.746568 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:29:40 crc kubenswrapper[4773]: I0121 15:29:40.750000 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d76d5567-njpwd"] Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.084182 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn8fn"] Jan 21 15:29:41 crc kubenswrapper[4773]: W0121 15:29:41.090180 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70f2296_498b_4347_b80d_1d26a02d7d93.slice/crio-fd48b82a89687eaf44375f4ce3171bb8949016cc96e2ca9f74c9d8ca77f1475e WatchSource:0}: Error finding container fd48b82a89687eaf44375f4ce3171bb8949016cc96e2ca9f74c9d8ca77f1475e: Status 404 returned error can't find the container with id fd48b82a89687eaf44375f4ce3171bb8949016cc96e2ca9f74c9d8ca77f1475e Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.390934 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6d0d08-8ef5-49c9-b154-f5a014b6e47e" path="/var/lib/kubelet/pods/0a6d0d08-8ef5-49c9-b154-f5a014b6e47e/volumes" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.440578 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp"] Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.441205 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.443717 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.443971 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.444106 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.444307 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.444506 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.445922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.461876 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp"] Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.561458 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-client-ca\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.561529 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmvx\" (UniqueName: \"kubernetes.io/projected/0b8345dd-3088-4789-8ec4-a35576d13ea9-kube-api-access-vcmvx\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.561596 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-config\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.561634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b8345dd-3088-4789-8ec4-a35576d13ea9-serving-cert\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.662661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmvx\" (UniqueName: \"kubernetes.io/projected/0b8345dd-3088-4789-8ec4-a35576d13ea9-kube-api-access-vcmvx\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.662743 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-config\") pod 
\"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.662793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b8345dd-3088-4789-8ec4-a35576d13ea9-serving-cert\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.662859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-client-ca\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.663842 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-client-ca\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.665145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b8345dd-3088-4789-8ec4-a35576d13ea9-config\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.680124 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vcmvx\" (UniqueName: \"kubernetes.io/projected/0b8345dd-3088-4789-8ec4-a35576d13ea9-kube-api-access-vcmvx\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.684503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b8345dd-3088-4789-8ec4-a35576d13ea9-serving-cert\") pod \"route-controller-manager-dc847bb58-sfvwp\" (UID: \"0b8345dd-3088-4789-8ec4-a35576d13ea9\") " pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.720008 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f1326-f825-4ac0-90dd-a02f8dc8756d" containerID="11a0816ab83cfb85b2716dd16a9ed8b8eb761a6fd48bf6d92ee6fd91af8c1c29" exitCode=0 Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.720111 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nph5h" event={"ID":"520f1326-f825-4ac0-90dd-a02f8dc8756d","Type":"ContainerDied","Data":"11a0816ab83cfb85b2716dd16a9ed8b8eb761a6fd48bf6d92ee6fd91af8c1c29"} Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.721983 4773 generic.go:334] "Generic (PLEG): container finished" podID="a70f2296-498b-4347-b80d-1d26a02d7d93" containerID="0dbbdac965a5ca004c748b4609de584ef32d1588992b604eb6546901d846afd2" exitCode=0 Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.722047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8fn" event={"ID":"a70f2296-498b-4347-b80d-1d26a02d7d93","Type":"ContainerDied","Data":"0dbbdac965a5ca004c748b4609de584ef32d1588992b604eb6546901d846afd2"} Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.722126 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-bn8fn" event={"ID":"a70f2296-498b-4347-b80d-1d26a02d7d93","Type":"ContainerStarted","Data":"fd48b82a89687eaf44375f4ce3171bb8949016cc96e2ca9f74c9d8ca77f1475e"} Jan 21 15:29:41 crc kubenswrapper[4773]: I0121 15:29:41.793220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.172509 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5rdq"] Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.174077 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.175462 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5rdq"] Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.177771 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.185026 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp"] Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.271629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-catalog-content\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.271711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84kh\" (UniqueName: 
\"kubernetes.io/projected/37e909e2-8f8b-47ca-bfef-c71fb0a08533-kube-api-access-l84kh\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.271747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-utilities\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.372851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-catalog-content\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.373336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l84kh\" (UniqueName: \"kubernetes.io/projected/37e909e2-8f8b-47ca-bfef-c71fb0a08533-kube-api-access-l84kh\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.373383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-utilities\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.373654 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-catalog-content\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.373771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e909e2-8f8b-47ca-bfef-c71fb0a08533-utilities\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.395080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84kh\" (UniqueName: \"kubernetes.io/projected/37e909e2-8f8b-47ca-bfef-c71fb0a08533-kube-api-access-l84kh\") pod \"certified-operators-w5rdq\" (UID: \"37e909e2-8f8b-47ca-bfef-c71fb0a08533\") " pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.514278 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.731682 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nph5h" event={"ID":"520f1326-f825-4ac0-90dd-a02f8dc8756d","Type":"ContainerStarted","Data":"4a2c0d1dbe8b9344a4648a5000d4331e1a7301a27dfb58ef3a8421e8714e5ab1"} Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.733071 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" event={"ID":"0b8345dd-3088-4789-8ec4-a35576d13ea9","Type":"ContainerStarted","Data":"7aff5289088e667ab2b428445cb791f6c90cbf8a8ab7fdc05baab3057808a1dc"} Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.733094 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" event={"ID":"0b8345dd-3088-4789-8ec4-a35576d13ea9","Type":"ContainerStarted","Data":"69b75215cd1cdfa56de1b48cce9f0a9abf79b7b395d9d66047a5f219afaa5a08"} Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.733736 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.753124 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nph5h" podStartSLOduration=2.240735689 podStartE2EDuration="3.753104761s" podCreationTimestamp="2026-01-21 15:29:39 +0000 UTC" firstStartedPulling="2026-01-21 15:29:40.712758682 +0000 UTC m=+345.637248304" lastFinishedPulling="2026-01-21 15:29:42.225127754 +0000 UTC m=+347.149617376" observedRunningTime="2026-01-21 15:29:42.750286974 +0000 UTC m=+347.674776596" watchObservedRunningTime="2026-01-21 15:29:42.753104761 +0000 UTC m=+347.677594393" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 
15:29:42.769376 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmgvw"] Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.775135 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.775327 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" podStartSLOduration=3.775314492 podStartE2EDuration="3.775314492s" podCreationTimestamp="2026-01-21 15:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:29:42.771742144 +0000 UTC m=+347.696231786" watchObservedRunningTime="2026-01-21 15:29:42.775314492 +0000 UTC m=+347.699804124" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.777359 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.784225 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmgvw"] Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.881805 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-catalog-content\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.881858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pl2s\" (UniqueName: \"kubernetes.io/projected/a6b345a6-34e6-43c7-899b-5e35c36310c4-kube-api-access-2pl2s\") pod 
\"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.881901 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-utilities\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.940476 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5rdq"] Jan 21 15:29:42 crc kubenswrapper[4773]: W0121 15:29:42.953136 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e909e2_8f8b_47ca_bfef_c71fb0a08533.slice/crio-d65844410ba709008cbf53cf19f2e71d89bc76348cee755653c7678dc45efc93 WatchSource:0}: Error finding container d65844410ba709008cbf53cf19f2e71d89bc76348cee755653c7678dc45efc93: Status 404 returned error can't find the container with id d65844410ba709008cbf53cf19f2e71d89bc76348cee755653c7678dc45efc93 Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.971096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dc847bb58-sfvwp" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.983316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-catalog-content\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.983364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2pl2s\" (UniqueName: \"kubernetes.io/projected/a6b345a6-34e6-43c7-899b-5e35c36310c4-kube-api-access-2pl2s\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.983412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-utilities\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.983949 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-catalog-content\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:42 crc kubenswrapper[4773]: I0121 15:29:42.983973 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b345a6-34e6-43c7-899b-5e35c36310c4-utilities\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.007961 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pl2s\" (UniqueName: \"kubernetes.io/projected/a6b345a6-34e6-43c7-899b-5e35c36310c4-kube-api-access-2pl2s\") pod \"community-operators-wmgvw\" (UID: \"a6b345a6-34e6-43c7-899b-5e35c36310c4\") " pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.093893 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.517454 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmgvw"] Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.740719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmgvw" event={"ID":"a6b345a6-34e6-43c7-899b-5e35c36310c4","Type":"ContainerStarted","Data":"5e97498ab1fdc9545b6531d0e634f1342a552b2ea87da64c0a55921f29951691"} Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.742690 4773 generic.go:334] "Generic (PLEG): container finished" podID="37e909e2-8f8b-47ca-bfef-c71fb0a08533" containerID="8480bd1eada7ab726cde543ffa6ece5b1d2473242568047c205f4b80809d2d49" exitCode=0 Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.742766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5rdq" event={"ID":"37e909e2-8f8b-47ca-bfef-c71fb0a08533","Type":"ContainerDied","Data":"8480bd1eada7ab726cde543ffa6ece5b1d2473242568047c205f4b80809d2d49"} Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.742788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5rdq" event={"ID":"37e909e2-8f8b-47ca-bfef-c71fb0a08533","Type":"ContainerStarted","Data":"d65844410ba709008cbf53cf19f2e71d89bc76348cee755653c7678dc45efc93"} Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.747059 4773 generic.go:334] "Generic (PLEG): container finished" podID="a70f2296-498b-4347-b80d-1d26a02d7d93" containerID="85e31d2f191b67bfd1b6099830f386774a3cfbd3e5aecd7a26585785b203a321" exitCode=0 Jan 21 15:29:43 crc kubenswrapper[4773]: I0121 15:29:43.747670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8fn" 
event={"ID":"a70f2296-498b-4347-b80d-1d26a02d7d93","Type":"ContainerDied","Data":"85e31d2f191b67bfd1b6099830f386774a3cfbd3e5aecd7a26585785b203a321"} Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.754577 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8fn" event={"ID":"a70f2296-498b-4347-b80d-1d26a02d7d93","Type":"ContainerStarted","Data":"6fe5488bd5e4f8478c48aceaaa89d8768d6027483cf08bad4e2e0bb7d95c72db"} Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.756844 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6b345a6-34e6-43c7-899b-5e35c36310c4" containerID="96534ef3cbc14988a3d7543407ffff1bbf82e153285e8cf92307f4b7137ca029" exitCode=0 Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.756883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmgvw" event={"ID":"a6b345a6-34e6-43c7-899b-5e35c36310c4","Type":"ContainerDied","Data":"96534ef3cbc14988a3d7543407ffff1bbf82e153285e8cf92307f4b7137ca029"} Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.759446 4773 generic.go:334] "Generic (PLEG): container finished" podID="37e909e2-8f8b-47ca-bfef-c71fb0a08533" containerID="bc669ac79590553e2c0c12edfec8fb8a9e6095b32eda1fb56389760cd6af1b3d" exitCode=0 Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.759515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5rdq" event={"ID":"37e909e2-8f8b-47ca-bfef-c71fb0a08533","Type":"ContainerDied","Data":"bc669ac79590553e2c0c12edfec8fb8a9e6095b32eda1fb56389760cd6af1b3d"} Jan 21 15:29:44 crc kubenswrapper[4773]: I0121 15:29:44.778297 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bn8fn" podStartSLOduration=2.3004007140000002 podStartE2EDuration="4.778273594s" podCreationTimestamp="2026-01-21 15:29:40 +0000 UTC" firstStartedPulling="2026-01-21 15:29:41.723680177 +0000 UTC 
m=+346.648169799" lastFinishedPulling="2026-01-21 15:29:44.201553057 +0000 UTC m=+349.126042679" observedRunningTime="2026-01-21 15:29:44.77598297 +0000 UTC m=+349.700472592" watchObservedRunningTime="2026-01-21 15:29:44.778273594 +0000 UTC m=+349.702763226" Jan 21 15:29:45 crc kubenswrapper[4773]: I0121 15:29:45.767202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5rdq" event={"ID":"37e909e2-8f8b-47ca-bfef-c71fb0a08533","Type":"ContainerStarted","Data":"c32ff1b5a4ec5b2593f9ebe8de4941d9f640e4b4a0dedefc19ca30d54e500343"} Jan 21 15:29:45 crc kubenswrapper[4773]: I0121 15:29:45.769448 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6b345a6-34e6-43c7-899b-5e35c36310c4" containerID="d47485363a4999d1a87b734a2b693cc47014dac24d19b983429aa68de2abf4c5" exitCode=0 Jan 21 15:29:45 crc kubenswrapper[4773]: I0121 15:29:45.769487 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmgvw" event={"ID":"a6b345a6-34e6-43c7-899b-5e35c36310c4","Type":"ContainerDied","Data":"d47485363a4999d1a87b734a2b693cc47014dac24d19b983429aa68de2abf4c5"} Jan 21 15:29:45 crc kubenswrapper[4773]: I0121 15:29:45.789607 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5rdq" podStartSLOduration=2.13005282 podStartE2EDuration="3.78958995s" podCreationTimestamp="2026-01-21 15:29:42 +0000 UTC" firstStartedPulling="2026-01-21 15:29:43.744426427 +0000 UTC m=+348.668916049" lastFinishedPulling="2026-01-21 15:29:45.403963557 +0000 UTC m=+350.328453179" observedRunningTime="2026-01-21 15:29:45.783553734 +0000 UTC m=+350.708043356" watchObservedRunningTime="2026-01-21 15:29:45.78958995 +0000 UTC m=+350.714079572" Jan 21 15:29:46 crc kubenswrapper[4773]: I0121 15:29:46.777915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmgvw" 
event={"ID":"a6b345a6-34e6-43c7-899b-5e35c36310c4","Type":"ContainerStarted","Data":"799444c070c8bbe160f7e8e1e96b0a0ecad2a437f97eccf0e406e613c3981dd6"} Jan 21 15:29:46 crc kubenswrapper[4773]: I0121 15:29:46.793087 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmgvw" podStartSLOduration=3.397845429 podStartE2EDuration="4.793012579s" podCreationTimestamp="2026-01-21 15:29:42 +0000 UTC" firstStartedPulling="2026-01-21 15:29:44.758915991 +0000 UTC m=+349.683405613" lastFinishedPulling="2026-01-21 15:29:46.154083141 +0000 UTC m=+351.078572763" observedRunningTime="2026-01-21 15:29:46.791918049 +0000 UTC m=+351.716407701" watchObservedRunningTime="2026-01-21 15:29:46.793012579 +0000 UTC m=+351.717502201" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.099167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.099756 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.146116 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.684520 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.684969 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.728141 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.833449 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nph5h" Jan 21 15:29:50 crc kubenswrapper[4773]: I0121 15:29:50.834098 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bn8fn" Jan 21 15:29:52 crc kubenswrapper[4773]: I0121 15:29:52.515718 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:52 crc kubenswrapper[4773]: I0121 15:29:52.516075 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:52 crc kubenswrapper[4773]: I0121 15:29:52.569567 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:52 crc kubenswrapper[4773]: I0121 15:29:52.847523 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5rdq" Jan 21 15:29:53 crc kubenswrapper[4773]: I0121 15:29:53.094246 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:53 crc kubenswrapper[4773]: I0121 15:29:53.094294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:53 crc kubenswrapper[4773]: I0121 15:29:53.134044 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:53 crc kubenswrapper[4773]: I0121 15:29:53.877676 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmgvw" Jan 21 15:29:55 crc kubenswrapper[4773]: I0121 15:29:55.205970 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:29:55 crc kubenswrapper[4773]: I0121 15:29:55.206040 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.169183 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"]
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.170133 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.178043 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.178328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.180427 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"]
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.241179 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.241237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.241427 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsx7\" (UniqueName: \"kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.342823 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.342873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.342952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsx7\" (UniqueName: \"kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.344003 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.348985 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.360303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsx7\" (UniqueName: \"kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7\") pod \"collect-profiles-29483490-vw8sh\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.486818 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:00 crc kubenswrapper[4773]: I0121 15:30:00.878986 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"]
Jan 21 15:30:01 crc kubenswrapper[4773]: I0121 15:30:01.848333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh" event={"ID":"d1eed7a3-89e0-412b-b931-9e3905a32965","Type":"ContainerStarted","Data":"6f3fff73ebc1aa65e87e5d4adc59e5b125658695ede2038b8aebc7b5d34a59fe"}
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.464944 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" podUID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" containerName="registry" containerID="cri-o://13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8" gracePeriod=30
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.828672 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.866728 4773 generic.go:334] "Generic (PLEG): container finished" podID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" containerID="13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8" exitCode=0
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.866820 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm"
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.866825 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" event={"ID":"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d","Type":"ContainerDied","Data":"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"}
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.866928 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cb2vm" event={"ID":"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d","Type":"ContainerDied","Data":"5f3603d112dc821a9501675598e35d1ceefbff7a584f774951e2ae2d419ff3ae"}
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.866958 4773 scope.go:117] "RemoveContainer" containerID="13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.871222 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1eed7a3-89e0-412b-b931-9e3905a32965" containerID="cebb40f15a1fe212d9b28102edc3463fd99fdd63727d02c79f72e2271b1a098f" exitCode=0
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.871274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh" event={"ID":"d1eed7a3-89e0-412b-b931-9e3905a32965","Type":"ContainerDied","Data":"cebb40f15a1fe212d9b28102edc3463fd99fdd63727d02c79f72e2271b1a098f"}
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.892609 4773 scope.go:117] "RemoveContainer" containerID="13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"
Jan 21 15:30:02 crc kubenswrapper[4773]: E0121 15:30:02.893136 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8\": container with ID starting with 13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8 not found: ID does not exist" containerID="13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.893195 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8"} err="failed to get container status \"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8\": rpc error: code = NotFound desc = could not find container \"13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8\": container with ID starting with 13f588325a3ebd87d8f5b2945cadf425045a35d023eaa7fabca7a5ee9367e0d8 not found: ID does not exist"
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978242 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978382 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978551 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksc8d\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.978572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token\") pod \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\" (UID: \"a0f29f37-d4e3-4767-8077-45ffbf1ddd6d\") "
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.980084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.980098 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.985922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d" (OuterVolumeSpecName: "kube-api-access-ksc8d") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "kube-api-access-ksc8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.986139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.986342 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.987447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.991728 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 21 15:30:02 crc kubenswrapper[4773]: I0121 15:30:02.997623 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" (UID: "a0f29f37-d4e3-4767-8077-45ffbf1ddd6d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079865 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079921 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079933 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079942 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079951 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksc8d\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-kube-api-access-ksc8d\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079959 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.079967 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.193993 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"]
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.197720 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cb2vm"]
Jan 21 15:30:03 crc kubenswrapper[4773]: I0121 15:30:03.390403 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" path="/var/lib/kubelet/pods/a0f29f37-d4e3-4767-8077-45ffbf1ddd6d/volumes"
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.109095 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.295611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume\") pod \"d1eed7a3-89e0-412b-b931-9e3905a32965\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") "
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.295735 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgsx7\" (UniqueName: \"kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7\") pod \"d1eed7a3-89e0-412b-b931-9e3905a32965\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") "
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.295820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume\") pod \"d1eed7a3-89e0-412b-b931-9e3905a32965\" (UID: \"d1eed7a3-89e0-412b-b931-9e3905a32965\") "
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.296901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1eed7a3-89e0-412b-b931-9e3905a32965" (UID: "d1eed7a3-89e0-412b-b931-9e3905a32965"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.299647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1eed7a3-89e0-412b-b931-9e3905a32965" (UID: "d1eed7a3-89e0-412b-b931-9e3905a32965"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.299809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7" (OuterVolumeSpecName: "kube-api-access-lgsx7") pod "d1eed7a3-89e0-412b-b931-9e3905a32965" (UID: "d1eed7a3-89e0-412b-b931-9e3905a32965"). InnerVolumeSpecName "kube-api-access-lgsx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.397156 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgsx7\" (UniqueName: \"kubernetes.io/projected/d1eed7a3-89e0-412b-b931-9e3905a32965-kube-api-access-lgsx7\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.397188 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1eed7a3-89e0-412b-b931-9e3905a32965-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.397197 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1eed7a3-89e0-412b-b931-9e3905a32965-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.884402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh" event={"ID":"d1eed7a3-89e0-412b-b931-9e3905a32965","Type":"ContainerDied","Data":"6f3fff73ebc1aa65e87e5d4adc59e5b125658695ede2038b8aebc7b5d34a59fe"}
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.884464 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3fff73ebc1aa65e87e5d4adc59e5b125658695ede2038b8aebc7b5d34a59fe"
Jan 21 15:30:04 crc kubenswrapper[4773]: I0121 15:30:04.884548 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"
Jan 21 15:30:25 crc kubenswrapper[4773]: I0121 15:30:25.206226 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:30:25 crc kubenswrapper[4773]: I0121 15:30:25.206826 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:30:55 crc kubenswrapper[4773]: I0121 15:30:55.206557 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:30:55 crc kubenswrapper[4773]: I0121 15:30:55.207504 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:30:55 crc kubenswrapper[4773]: I0121 15:30:55.207574 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 15:30:55 crc kubenswrapper[4773]: I0121 15:30:55.208587 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:30:55 crc kubenswrapper[4773]: I0121 15:30:55.208660 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f" gracePeriod=600
Jan 21 15:30:56 crc kubenswrapper[4773]: I0121 15:30:56.162356 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f" exitCode=0
Jan 21 15:30:56 crc kubenswrapper[4773]: I0121 15:30:56.162530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f"}
Jan 21 15:30:56 crc kubenswrapper[4773]: I0121 15:30:56.162805 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6"}
Jan 21 15:30:56 crc kubenswrapper[4773]: I0121 15:30:56.162824 4773 scope.go:117] "RemoveContainer" containerID="4eb88b9c84dbf4878030f533120d6ba85b2a0deff3f3f1c12cc4ca2674b53e0b"
Jan 21 15:32:55 crc kubenswrapper[4773]: I0121 15:32:55.205632 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:32:55 crc kubenswrapper[4773]: I0121 15:32:55.206241 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:33:25 crc kubenswrapper[4773]: I0121 15:33:25.206315 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:33:25 crc kubenswrapper[4773]: I0121 15:33:25.207067 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:33:55 crc kubenswrapper[4773]: I0121 15:33:55.206674 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:33:55 crc kubenswrapper[4773]: I0121 15:33:55.207270 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:33:55 crc kubenswrapper[4773]: I0121 15:33:55.207324 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 15:33:55 crc kubenswrapper[4773]: I0121 15:33:55.208010 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:33:55 crc kubenswrapper[4773]: I0121 15:33:55.208079 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6" gracePeriod=600
Jan 21 15:33:56 crc kubenswrapper[4773]: I0121 15:33:56.229772 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6" exitCode=0
Jan 21 15:33:56 crc kubenswrapper[4773]: I0121 15:33:56.229871 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6"}
Jan 21 15:33:56 crc kubenswrapper[4773]: I0121 15:33:56.230728 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328"}
Jan 21 15:33:56 crc kubenswrapper[4773]: I0121 15:33:56.230757 4773 scope.go:117] "RemoveContainer" containerID="f994c8b8ffd7d3d170163ecb5d02f4d4eede87eb14206bde3e120b03784b936f"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.412788 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"]
Jan 21 15:35:09 crc kubenswrapper[4773]: E0121 15:35:09.413496 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" containerName="registry"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.413510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" containerName="registry"
Jan 21 15:35:09 crc kubenswrapper[4773]: E0121 15:35:09.413535 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1eed7a3-89e0-412b-b931-9e3905a32965" containerName="collect-profiles"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.413540 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1eed7a3-89e0-412b-b931-9e3905a32965" containerName="collect-profiles"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.413626 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f29f37-d4e3-4767-8077-45ffbf1ddd6d" containerName="registry"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.413637 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1eed7a3-89e0-412b-b931-9e3905a32965" containerName="collect-profiles"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.414381 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.416521 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.433580 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"]
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.534332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.535221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rwv\" (UniqueName: \"kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.535346 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.636597 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.637046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rwv\" (UniqueName: \"kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.637162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.637164 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.637370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.666524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rwv\" (UniqueName: \"kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.736899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"
Jan 21 15:35:09 crc kubenswrapper[4773]: I0121 15:35:09.942888 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv"]
Jan 21 15:35:10 crc kubenswrapper[4773]: I0121 15:35:10.690490 4773 generic.go:334] "Generic (PLEG): container finished" podID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerID="1396b1eb109ce88f0f1b2261b358fec0993515f6a16dd01899f0d019adae9b46" exitCode=0
Jan 21 15:35:10 crc kubenswrapper[4773]: I0121 15:35:10.690557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" event={"ID":"5476c84f-13d2-4ef5-8426-0147b15b4899","Type":"ContainerDied","Data":"1396b1eb109ce88f0f1b2261b358fec0993515f6a16dd01899f0d019adae9b46"}
Jan 21 15:35:10 crc kubenswrapper[4773]: I0121 15:35:10.690935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" event={"ID":"5476c84f-13d2-4ef5-8426-0147b15b4899","Type":"ContainerStarted","Data":"09a07ff41293eb3a1f91c694a46f6424bd5961471ce66fce390562e522631b56"}
Jan 21 15:35:10 crc kubenswrapper[4773]: I0121 15:35:10.693852 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 15:35:12 crc kubenswrapper[4773]: I0121 15:35:12.705232 4773 generic.go:334] "Generic (PLEG): container finished" podID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerID="ba871184b7954a580650e1df449e66daac17dee1b1cd06d9575d32ca733c00c7" exitCode=0
Jan 21 15:35:12 crc kubenswrapper[4773]: I0121 15:35:12.705342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" event={"ID":"5476c84f-13d2-4ef5-8426-0147b15b4899","Type":"ContainerDied","Data":"ba871184b7954a580650e1df449e66daac17dee1b1cd06d9575d32ca733c00c7"}
Jan 21 15:35:13 crc kubenswrapper[4773]: I0121 15:35:13.712975 4773 generic.go:334] "Generic (PLEG): container finished" podID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerID="d67470c3af30a8667a201536de391cd9e707f64b6d2aea6866d7953d002e39b7" exitCode=0
Jan 21 15:35:13 crc kubenswrapper[4773]: I0121 15:35:13.713059 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" event={"ID":"5476c84f-13d2-4ef5-8426-0147b15b4899","Type":"ContainerDied","Data":"d67470c3af30a8667a201536de391cd9e707f64b6d2aea6866d7953d002e39b7"}
Jan 21 15:35:14 crc kubenswrapper[4773]: I0121 15:35:14.949247 4773 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.008654 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle\") pod \"5476c84f-13d2-4ef5-8426-0147b15b4899\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.008742 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util\") pod \"5476c84f-13d2-4ef5-8426-0147b15b4899\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.008871 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rwv\" (UniqueName: \"kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv\") pod \"5476c84f-13d2-4ef5-8426-0147b15b4899\" (UID: \"5476c84f-13d2-4ef5-8426-0147b15b4899\") " Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.010676 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle" (OuterVolumeSpecName: "bundle") pod "5476c84f-13d2-4ef5-8426-0147b15b4899" (UID: "5476c84f-13d2-4ef5-8426-0147b15b4899"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.014672 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv" (OuterVolumeSpecName: "kube-api-access-t9rwv") pod "5476c84f-13d2-4ef5-8426-0147b15b4899" (UID: "5476c84f-13d2-4ef5-8426-0147b15b4899"). InnerVolumeSpecName "kube-api-access-t9rwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.023541 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util" (OuterVolumeSpecName: "util") pod "5476c84f-13d2-4ef5-8426-0147b15b4899" (UID: "5476c84f-13d2-4ef5-8426-0147b15b4899"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.110656 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.110772 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5476c84f-13d2-4ef5-8426-0147b15b4899-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.110784 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rwv\" (UniqueName: \"kubernetes.io/projected/5476c84f-13d2-4ef5-8426-0147b15b4899-kube-api-access-t9rwv\") on node \"crc\" DevicePath \"\"" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.729342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" event={"ID":"5476c84f-13d2-4ef5-8426-0147b15b4899","Type":"ContainerDied","Data":"09a07ff41293eb3a1f91c694a46f6424bd5961471ce66fce390562e522631b56"} Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.729687 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09a07ff41293eb3a1f91c694a46f6424bd5961471ce66fce390562e522631b56" Jan 21 15:35:15 crc kubenswrapper[4773]: I0121 15:35:15.729383 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.583474 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94hkt"] Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585087 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-controller" containerID="cri-o://725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585133 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="sbdb" containerID="cri-o://a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585205 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-acl-logging" containerID="cri-o://a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585105 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="northd" containerID="cri-o://1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585406 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" 
containerName="kube-rbac-proxy-node" containerID="cri-o://7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585123 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="nbdb" containerID="cri-o://112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.585238 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.612465 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" containerID="cri-o://f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" gracePeriod=30 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.758973 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovnkube-controller/2.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.760927 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-acl-logging/0.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761372 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-controller/0.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 
15:35:20.761719 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" exitCode=0 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761738 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" exitCode=0 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761745 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" exitCode=0 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761753 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" exitCode=0 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761759 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" exitCode=143 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761766 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" exitCode=143 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761804 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761829 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" 
event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.761879 4773 scope.go:117] "RemoveContainer" containerID="40965f9fde5a1bc069ced2d2ccf30e201d46bd5b848bd398319177b50ed628d0" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.764010 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/1.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.764347 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/0.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.764371 4773 generic.go:334] "Generic (PLEG): container finished" podID="34d54fdd-eda0-441f-b721-0adecc20a0db" containerID="85809c36839dac071be64acdad8a32525bf32b4586611e5ff9424305ca3f8e9b" exitCode=2 Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.764400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerDied","Data":"85809c36839dac071be64acdad8a32525bf32b4586611e5ff9424305ca3f8e9b"} Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.764819 4773 scope.go:117] "RemoveContainer" containerID="85809c36839dac071be64acdad8a32525bf32b4586611e5ff9424305ca3f8e9b" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.791647 4773 scope.go:117] "RemoveContainer" containerID="e7ada6fda936f90018e60c58c6e53b6e194f40dedb495a4faec4bbb67c14364b" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.942416 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-acl-logging/0.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.943119 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-controller/0.log" Jan 21 15:35:20 crc kubenswrapper[4773]: I0121 15:35:20.943576 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:20.999992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rm4rp"] Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000258 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="pull" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000275 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="pull" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000297 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-node" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-node" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000325 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-acl-logging" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000334 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-acl-logging" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000344 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" 
containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000350 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000360 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="extract" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000367 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="extract" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000381 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="northd" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000388 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="northd" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000400 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000408 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="util" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000426 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="util" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000434 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kubecfg-setup" Jan 21 
15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000441 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kubecfg-setup" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000451 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="nbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000458 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="nbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000466 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="sbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000473 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="sbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000481 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000488 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.000495 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000503 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000621 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 
15:35:21.000635 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="northd" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000643 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000656 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-acl-logging" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000666 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="sbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000675 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovn-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000685 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5476c84f-13d2-4ef5-8426-0147b15b4899" containerName="extract" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000709 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="kube-rbac-proxy-node" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000718 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000725 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000732 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="nbdb" Jan 21 15:35:21 crc kubenswrapper[4773]: 
E0121 15:35:21.000831 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000840 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.000950 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" containerName="ovnkube-controller" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.007243 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082463 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082843 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082962 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083043 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083150 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083225 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") " Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082612 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083323 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.082954 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083280 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083299 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083430 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083448 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083464 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket" (OuterVolumeSpecName: "log-socket") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083473 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083482 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083516 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083533 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083570 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log" (OuterVolumeSpecName: "node-log") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083589 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash" (OuterVolumeSpecName: "host-slash") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083607 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drtp\" (UniqueName: \"kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083635 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083703 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083716 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet\") pod \"2d23a5a4-6787-45a5-9664-20318156f46f\" (UID: \"2d23a5a4-6787-45a5-9664-20318156f46f\") "
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083839 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-netd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-etc-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-env-overrides\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-config\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drdst\" (UniqueName: \"kubernetes.io/projected/875a7601-14be-436c-9f64-0e93f161e2b2-kube-api-access-drdst\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.083987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-systemd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084009 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/875a7601-14be-436c-9f64-0e93f161e2b2-ovn-node-metrics-cert\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084036 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084070 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-kubelet\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084098 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-node-log\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084114 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-script-lib\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084139 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-var-lib-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084175 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-ovn\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084191 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-log-socket\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084208 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-slash\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084240 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-systemd-units\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084255 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-bin\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084286 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-netns\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084519 4773 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-kubelet\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084530 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-bin\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084541 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084550 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084558 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084566 4773 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084574 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084584 4773 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-log-socket\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084592 4773 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084602 4773 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084610 4773 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-node-log\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084621 4773 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.084628 4773 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-slash\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.085432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.088040 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.088170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp" (OuterVolumeSpecName: "kube-api-access-9drtp") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "kube-api-access-9drtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.097896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2d23a5a4-6787-45a5-9664-20318156f46f" (UID: "2d23a5a4-6787-45a5-9664-20318156f46f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-kubelet\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185643 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-node-log\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-script-lib\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185780 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-var-lib-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185801 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185823 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-ovn\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185841 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-log-socket\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-slash\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-bin\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185891 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-systemd-units\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-netns\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-netd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185957 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-etc-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.185993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-env-overrides\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-config\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdst\" (UniqueName: \"kubernetes.io/projected/875a7601-14be-436c-9f64-0e93f161e2b2-kube-api-access-drdst\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-systemd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/875a7601-14be-436c-9f64-0e93f161e2b2-ovn-node-metrics-cert\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186113 4773 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-systemd-units\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186124 4773 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186134 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d23a5a4-6787-45a5-9664-20318156f46f-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186143 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d23a5a4-6787-45a5-9664-20318156f46f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186153 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drtp\" (UniqueName: \"kubernetes.io/projected/2d23a5a4-6787-45a5-9664-20318156f46f-kube-api-access-9drtp\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186161 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-cni-netd\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186169 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d23a5a4-6787-45a5-9664-20318156f46f-host-run-netns\") on node \"crc\" DevicePath \"\""
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186244 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-kubelet\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186266 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-node-log\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186835 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-script-lib\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186874 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-var-lib-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186917 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-ovn\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186936 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-log-socket\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-slash\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186975 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-bin\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.186994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-systemd-units\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-run-netns\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-cni-netd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-etc-openvswitch\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187133 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp"
Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187445 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-env-overrides\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.187849 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/875a7601-14be-436c-9f64-0e93f161e2b2-ovnkube-config\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.188119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/875a7601-14be-436c-9f64-0e93f161e2b2-run-systemd\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.191140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/875a7601-14be-436c-9f64-0e93f161e2b2-ovn-node-metrics-cert\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.223257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdst\" (UniqueName: \"kubernetes.io/projected/875a7601-14be-436c-9f64-0e93f161e2b2-kube-api-access-drdst\") pod \"ovnkube-node-rm4rp\" (UID: \"875a7601-14be-436c-9f64-0e93f161e2b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.321664 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:21 crc kubenswrapper[4773]: W0121 15:35:21.339712 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875a7601_14be_436c_9f64_0e93f161e2b2.slice/crio-871384e02356982a8998fd955fe1749b2e4759e715b772acc174466b0bfc6242 WatchSource:0}: Error finding container 871384e02356982a8998fd955fe1749b2e4759e715b772acc174466b0bfc6242: Status 404 returned error can't find the container with id 871384e02356982a8998fd955fe1749b2e4759e715b772acc174466b0bfc6242 Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.773404 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-acl-logging/0.log" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.774878 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-94hkt_2d23a5a4-6787-45a5-9664-20318156f46f/ovn-controller/0.log" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775283 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" exitCode=0 Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775319 4773 generic.go:334] "Generic (PLEG): container finished" podID="2d23a5a4-6787-45a5-9664-20318156f46f" containerID="1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" exitCode=0 Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775413 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" event={"ID":"2d23a5a4-6787-45a5-9664-20318156f46f","Type":"ContainerDied","Data":"6ad5ea8c47d6b6b6f3e077d79aa034c1ff976d8214d7a3c612155defef9eb3a4"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775450 4773 scope.go:117] "RemoveContainer" containerID="f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.775868 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94hkt" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.777485 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/1.log" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.777571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gc5wj" event={"ID":"34d54fdd-eda0-441f-b721-0adecc20a0db","Type":"ContainerStarted","Data":"0e93575e4181466c26bc7ef6a26af281e4721eaf02b08f175757da752f2b19a2"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.782310 4773 generic.go:334] "Generic (PLEG): container finished" podID="875a7601-14be-436c-9f64-0e93f161e2b2" containerID="73141d97af8a3e2fb8c518b1b796c24ece8caae7693599b23d8b82cfec55f4bb" exitCode=0 Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.782347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" 
event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerDied","Data":"73141d97af8a3e2fb8c518b1b796c24ece8caae7693599b23d8b82cfec55f4bb"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.782370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"871384e02356982a8998fd955fe1749b2e4759e715b772acc174466b0bfc6242"} Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.789675 4773 scope.go:117] "RemoveContainer" containerID="a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.808593 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94hkt"] Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.813588 4773 scope.go:117] "RemoveContainer" containerID="112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.815663 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94hkt"] Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.853831 4773 scope.go:117] "RemoveContainer" containerID="1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.879780 4773 scope.go:117] "RemoveContainer" containerID="1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.894278 4773 scope.go:117] "RemoveContainer" containerID="7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.909841 4773 scope.go:117] "RemoveContainer" containerID="a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.927515 4773 scope.go:117] "RemoveContainer" 
containerID="725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.956487 4773 scope.go:117] "RemoveContainer" containerID="781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.980878 4773 scope.go:117] "RemoveContainer" containerID="f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.981483 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74\": container with ID starting with f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74 not found: ID does not exist" containerID="f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.981511 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74"} err="failed to get container status \"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74\": rpc error: code = NotFound desc = could not find container \"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74\": container with ID starting with f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.981532 4773 scope.go:117] "RemoveContainer" containerID="a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.981753 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\": container with ID starting with 
a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f not found: ID does not exist" containerID="a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.981773 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f"} err="failed to get container status \"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\": rpc error: code = NotFound desc = could not find container \"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\": container with ID starting with a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.981785 4773 scope.go:117] "RemoveContainer" containerID="112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.982046 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\": container with ID starting with 112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db not found: ID does not exist" containerID="112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982067 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db"} err="failed to get container status \"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\": rpc error: code = NotFound desc = could not find container \"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\": container with ID starting with 112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db not found: ID does not 
exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982078 4773 scope.go:117] "RemoveContainer" containerID="1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.982287 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\": container with ID starting with 1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701 not found: ID does not exist" containerID="1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982306 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701"} err="failed to get container status \"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\": rpc error: code = NotFound desc = could not find container \"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\": container with ID starting with 1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982318 4773 scope.go:117] "RemoveContainer" containerID="1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.982674 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\": container with ID starting with 1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61 not found: ID does not exist" containerID="1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982706 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61"} err="failed to get container status \"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\": rpc error: code = NotFound desc = could not find container \"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\": container with ID starting with 1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.982725 4773 scope.go:117] "RemoveContainer" containerID="7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.983032 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\": container with ID starting with 7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160 not found: ID does not exist" containerID="7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.983055 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160"} err="failed to get container status \"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\": rpc error: code = NotFound desc = could not find container \"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\": container with ID starting with 7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.983070 4773 scope.go:117] "RemoveContainer" containerID="a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.983553 4773 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\": container with ID starting with a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e not found: ID does not exist" containerID="a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.983595 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e"} err="failed to get container status \"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\": rpc error: code = NotFound desc = could not find container \"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\": container with ID starting with a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.983628 4773 scope.go:117] "RemoveContainer" containerID="725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.984125 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\": container with ID starting with 725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a not found: ID does not exist" containerID="725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.984149 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a"} err="failed to get container status \"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\": rpc error: code = NotFound desc = could 
not find container \"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\": container with ID starting with 725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.984162 4773 scope.go:117] "RemoveContainer" containerID="781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8" Jan 21 15:35:21 crc kubenswrapper[4773]: E0121 15:35:21.989781 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\": container with ID starting with 781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8 not found: ID does not exist" containerID="781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.989818 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8"} err="failed to get container status \"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\": rpc error: code = NotFound desc = could not find container \"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\": container with ID starting with 781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.989838 4773 scope.go:117] "RemoveContainer" containerID="f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.990354 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74"} err="failed to get container status \"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74\": rpc error: code = NotFound 
desc = could not find container \"f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74\": container with ID starting with f3d0b95f42fbb0bdfb6596b35f9d057b2a6c33cbc6380b4e7704dd2f32b5cc74 not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.990393 4773 scope.go:117] "RemoveContainer" containerID="a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.993975 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f"} err="failed to get container status \"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\": rpc error: code = NotFound desc = could not find container \"a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f\": container with ID starting with a8fa200ad7090e51c554183eef06f7cf5257e55b4e00a71c64cd1e53577fdf0f not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.994002 4773 scope.go:117] "RemoveContainer" containerID="112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.999130 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db"} err="failed to get container status \"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\": rpc error: code = NotFound desc = could not find container \"112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db\": container with ID starting with 112ac240ffe18fc20199fb9c738ca427c06ad5f8c60a05e11adb14a8017521db not found: ID does not exist" Jan 21 15:35:21 crc kubenswrapper[4773]: I0121 15:35:21.999172 4773 scope.go:117] "RemoveContainer" containerID="1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 
15:35:22.002950 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701"} err="failed to get container status \"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\": rpc error: code = NotFound desc = could not find container \"1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701\": container with ID starting with 1c42cb037882a0ff4a29aa1e0486ed772807abe5b78ae84ef9abf044270b1701 not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.002987 4773 scope.go:117] "RemoveContainer" containerID="1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003388 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61"} err="failed to get container status \"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\": rpc error: code = NotFound desc = could not find container \"1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61\": container with ID starting with 1e4abcdc269505719849187435b9559faaef235a1ff5bab8bac0dffeb4ce8b61 not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003421 4773 scope.go:117] "RemoveContainer" containerID="7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003718 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160"} err="failed to get container status \"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\": rpc error: code = NotFound desc = could not find container \"7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160\": container with ID starting with 
7dfc3e2d134a6fe97369c58c2774b7911df1ff1865d5d336a41d334e2a69f160 not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003739 4773 scope.go:117] "RemoveContainer" containerID="a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003947 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e"} err="failed to get container status \"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\": rpc error: code = NotFound desc = could not find container \"a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e\": container with ID starting with a78039860779ca4be7a5c78de8acf455e80fd29214ffe8b003998312dc2dba8e not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.003965 4773 scope.go:117] "RemoveContainer" containerID="725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.004336 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a"} err="failed to get container status \"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\": rpc error: code = NotFound desc = could not find container \"725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a\": container with ID starting with 725a40cf552422140c5dbb48a25d8cac7ee73f3036ce5fc25ce7ddd8bf8dd15a not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.004359 4773 scope.go:117] "RemoveContainer" containerID="781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.004612 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8"} err="failed to get container status \"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\": rpc error: code = NotFound desc = could not find container \"781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8\": container with ID starting with 781e968d6d9ec260a6a0771d373de2d71fb8c97cb3c11748336aaee95a3eb0c8 not found: ID does not exist" Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.789543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"f49ab30c7b76d3542fb82705b236583bb80a5568d3088a69114b25ca4ebb9ee0"} Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.790062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"e73eb10235ee9be7f7d3ef0142487f32a940e5d7b2d75800b7319057db4c388c"} Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.790075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"9c5aeb297d24bfa36955529a1c6e22ab8c649e4f283e008b94dbee5e1a00227f"} Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.790083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"272b851db71004548d8f695913015a162b3df5ccfb6f7abcac9fa7fa0c230921"} Jan 21 15:35:22 crc kubenswrapper[4773]: I0121 15:35:22.790093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" 
event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"3fb9a43c14b75be150127747686a6d555790b0780b41c5acb7cddc54fccb68cb"} Jan 21 15:35:23 crc kubenswrapper[4773]: I0121 15:35:23.391544 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d23a5a4-6787-45a5-9664-20318156f46f" path="/var/lib/kubelet/pods/2d23a5a4-6787-45a5-9664-20318156f46f/volumes" Jan 21 15:35:23 crc kubenswrapper[4773]: I0121 15:35:23.798410 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"1eb1432c16e5a94169a8ea0dd442cbe90332d86284ad44edfd551e8e334439f4"} Jan 21 15:35:25 crc kubenswrapper[4773]: I0121 15:35:25.815709 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"63d53cd40d797f4702587f0c7f685ac054b05ab6149078a4c04930ebc732111f"} Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.041045 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz"] Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.041955 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.043996 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.045504 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-lvfpm" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.045878 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.149555 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26"] Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.150591 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.152296 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-x5289" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.152915 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.154011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9k94\" (UniqueName: \"kubernetes.io/projected/dbe682b1-6f91-4f6c-a43e-8b2520806e28-kube-api-access-r9k94\") pod \"obo-prometheus-operator-68bc856cb9-lsjgz\" (UID: \"dbe682b1-6f91-4f6c-a43e-8b2520806e28\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 
15:35:27.169751 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj"] Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.170468 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.255027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9k94\" (UniqueName: \"kubernetes.io/projected/dbe682b1-6f91-4f6c-a43e-8b2520806e28-kube-api-access-r9k94\") pod \"obo-prometheus-operator-68bc856cb9-lsjgz\" (UID: \"dbe682b1-6f91-4f6c-a43e-8b2520806e28\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.255087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.255137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: \"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.255167 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: \"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.255208 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.271444 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9k94\" (UniqueName: \"kubernetes.io/projected/dbe682b1-6f91-4f6c-a43e-8b2520806e28-kube-api-access-r9k94\") pod \"obo-prometheus-operator-68bc856cb9-lsjgz\" (UID: \"dbe682b1-6f91-4f6c-a43e-8b2520806e28\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.356641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: \"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.356715 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: 
\"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.356761 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.356804 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.361250 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: \"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.362497 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.363324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d10d91a-c775-4251-bd46-6034add658e3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26\" (UID: \"0d10d91a-c775-4251-bd46-6034add658e3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.364203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.366362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2451757-b101-4568-87a5-37a165b4a460-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj\" (UID: \"d2451757-b101-4568-87a5-37a165b4a460\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.390206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-649gj"] Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.390988 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.392812 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(7abbf6c929fbadeea6c910a2aa0f1b34f8c93010d2ecb6d64fc18fd6ae88aaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.392891 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(7abbf6c929fbadeea6c910a2aa0f1b34f8c93010d2ecb6d64fc18fd6ae88aaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.392918 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(7abbf6c929fbadeea6c910a2aa0f1b34f8c93010d2ecb6d64fc18fd6ae88aaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.392979 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators(dbe682b1-6f91-4f6c-a43e-8b2520806e28)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators(dbe682b1-6f91-4f6c-a43e-8b2520806e28)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(7abbf6c929fbadeea6c910a2aa0f1b34f8c93010d2ecb6d64fc18fd6ae88aaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" podUID="dbe682b1-6f91-4f6c-a43e-8b2520806e28" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.393902 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.394591 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-k298h" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.457583 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-472hs\" (UniqueName: \"kubernetes.io/projected/831aa084-8756-4fdc-bc57-38400d4a5650-kube-api-access-472hs\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.457632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/831aa084-8756-4fdc-bc57-38400d4a5650-observability-operator-tls\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.494292 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.515168 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(38b1eb2db7e44eb40a6d9d335ee8f1aaa2f6e5aafca1c114726658bfc34ea343): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.515245 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(38b1eb2db7e44eb40a6d9d335ee8f1aaa2f6e5aafca1c114726658bfc34ea343): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.515274 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(38b1eb2db7e44eb40a6d9d335ee8f1aaa2f6e5aafca1c114726658bfc34ea343): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.515328 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators(0d10d91a-c775-4251-bd46-6034add658e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators(0d10d91a-c775-4251-bd46-6034add658e3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(38b1eb2db7e44eb40a6d9d335ee8f1aaa2f6e5aafca1c114726658bfc34ea343): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" podUID="0d10d91a-c775-4251-bd46-6034add658e3" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.529600 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.549747 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(d5b128beb9edceaab0db3f4ef2da6bdecd35fb99f708b903f0c3148a9186a93f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.549819 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(d5b128beb9edceaab0db3f4ef2da6bdecd35fb99f708b903f0c3148a9186a93f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.549839 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(d5b128beb9edceaab0db3f4ef2da6bdecd35fb99f708b903f0c3148a9186a93f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.549892 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators(d2451757-b101-4568-87a5-37a165b4a460)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators(d2451757-b101-4568-87a5-37a165b4a460)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(d5b128beb9edceaab0db3f4ef2da6bdecd35fb99f708b903f0c3148a9186a93f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" podUID="d2451757-b101-4568-87a5-37a165b4a460" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.559253 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-472hs\" (UniqueName: \"kubernetes.io/projected/831aa084-8756-4fdc-bc57-38400d4a5650-kube-api-access-472hs\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.559303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/831aa084-8756-4fdc-bc57-38400d4a5650-observability-operator-tls\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.564685 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/831aa084-8756-4fdc-bc57-38400d4a5650-observability-operator-tls\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.575798 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-472hs\" (UniqueName: \"kubernetes.io/projected/831aa084-8756-4fdc-bc57-38400d4a5650-kube-api-access-472hs\") pod \"observability-operator-59bdc8b94-649gj\" (UID: \"831aa084-8756-4fdc-bc57-38400d4a5650\") " pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.586095 4773 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lnw8m"] Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.586887 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.590780 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-4lz7s" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.660606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxggz\" (UniqueName: \"kubernetes.io/projected/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-kube-api-access-xxggz\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.660742 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.705645 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.731652 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(c0a6e0a304ef69eb1bcbad4d2cdbab6635c6dd02064ab6f468e9780e55fa01d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.731737 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(c0a6e0a304ef69eb1bcbad4d2cdbab6635c6dd02064ab6f468e9780e55fa01d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.731761 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(c0a6e0a304ef69eb1bcbad4d2cdbab6635c6dd02064ab6f468e9780e55fa01d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.731816 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-649gj_openshift-operators(831aa084-8756-4fdc-bc57-38400d4a5650)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-649gj_openshift-operators(831aa084-8756-4fdc-bc57-38400d4a5650)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(c0a6e0a304ef69eb1bcbad4d2cdbab6635c6dd02064ab6f468e9780e55fa01d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-649gj" podUID="831aa084-8756-4fdc-bc57-38400d4a5650" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.762312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.762405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxggz\" (UniqueName: \"kubernetes.io/projected/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-kube-api-access-xxggz\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.763380 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.784459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxggz\" (UniqueName: \"kubernetes.io/projected/183b75e6-83ae-40f2-9c03-b2ff4e8959d2-kube-api-access-xxggz\") pod \"perses-operator-5bf474d74f-lnw8m\" (UID: \"183b75e6-83ae-40f2-9c03-b2ff4e8959d2\") " pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.828613 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" 
event={"ID":"875a7601-14be-436c-9f64-0e93f161e2b2","Type":"ContainerStarted","Data":"6d728bf0781bef0a2b6fe08e0e5df76d6efe60ba516af807da9de5b1716ee4c3"} Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.829073 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.829102 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.829115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.857842 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" podStartSLOduration=7.857825789 podStartE2EDuration="7.857825789s" podCreationTimestamp="2026-01-21 15:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:35:27.856440563 +0000 UTC m=+692.780930185" watchObservedRunningTime="2026-01-21 15:35:27.857825789 +0000 UTC m=+692.782315411" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.860207 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.881563 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:27 crc kubenswrapper[4773]: I0121 15:35:27.900165 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.933955 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(6aff4f2f57a1949b203d152166fbee7e0dd4d460a0091aa4dbcbd4dba573de38): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.934018 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(6aff4f2f57a1949b203d152166fbee7e0dd4d460a0091aa4dbcbd4dba573de38): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.934047 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(6aff4f2f57a1949b203d152166fbee7e0dd4d460a0091aa4dbcbd4dba573de38): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:27 crc kubenswrapper[4773]: E0121 15:35:27.934089 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-lnw8m_openshift-operators(183b75e6-83ae-40f2-9c03-b2ff4e8959d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-lnw8m_openshift-operators(183b75e6-83ae-40f2-9c03-b2ff4e8959d2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(6aff4f2f57a1949b203d152166fbee7e0dd4d460a0091aa4dbcbd4dba573de38): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" podUID="183b75e6-83ae-40f2-9c03-b2ff4e8959d2" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.824076 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lnw8m"] Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.829208 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz"] Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.829328 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.829680 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.834307 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj"] Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.834426 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.834866 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.836126 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.836637 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.841034 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26"] Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.841153 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.842856 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.856948 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-649gj"] Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.857081 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:28 crc kubenswrapper[4773]: I0121 15:35:28.857563 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.892047 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(2d39516dd041c701b7ce3a0ba85d17b71d82aa5fead0953955c5a3fc6070c0c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.892107 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(2d39516dd041c701b7ce3a0ba85d17b71d82aa5fead0953955c5a3fc6070c0c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.892134 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(2d39516dd041c701b7ce3a0ba85d17b71d82aa5fead0953955c5a3fc6070c0c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.892178 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators(dbe682b1-6f91-4f6c-a43e-8b2520806e28)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators(dbe682b1-6f91-4f6c-a43e-8b2520806e28)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-lsjgz_openshift-operators_dbe682b1-6f91-4f6c-a43e-8b2520806e28_0(2d39516dd041c701b7ce3a0ba85d17b71d82aa5fead0953955c5a3fc6070c0c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" podUID="dbe682b1-6f91-4f6c-a43e-8b2520806e28" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.894920 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(ef50d9eb1a2d4dd55f8cce8804412fe7309679fca9c0abb667d117c3e790332c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.894975 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(ef50d9eb1a2d4dd55f8cce8804412fe7309679fca9c0abb667d117c3e790332c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.894996 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(ef50d9eb1a2d4dd55f8cce8804412fe7309679fca9c0abb667d117c3e790332c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.895041 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-lnw8m_openshift-operators(183b75e6-83ae-40f2-9c03-b2ff4e8959d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-lnw8m_openshift-operators(183b75e6-83ae-40f2-9c03-b2ff4e8959d2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lnw8m_openshift-operators_183b75e6-83ae-40f2-9c03-b2ff4e8959d2_0(ef50d9eb1a2d4dd55f8cce8804412fe7309679fca9c0abb667d117c3e790332c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" podUID="183b75e6-83ae-40f2-9c03-b2ff4e8959d2" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.900935 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(999b25120ce4630e37f6621f6f396e843a1c36f5984a06f5cb47d499799428e4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.901007 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(999b25120ce4630e37f6621f6f396e843a1c36f5984a06f5cb47d499799428e4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.901035 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(999b25120ce4630e37f6621f6f396e843a1c36f5984a06f5cb47d499799428e4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.901088 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-649gj_openshift-operators(831aa084-8756-4fdc-bc57-38400d4a5650)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-649gj_openshift-operators(831aa084-8756-4fdc-bc57-38400d4a5650)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-649gj_openshift-operators_831aa084-8756-4fdc-bc57-38400d4a5650_0(999b25120ce4630e37f6621f6f396e843a1c36f5984a06f5cb47d499799428e4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-649gj" podUID="831aa084-8756-4fdc-bc57-38400d4a5650" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.909905 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(ea2d5fbf0d6f8390e6b40ad491cf87631bc1b7460778550b4db4a9d6fdfdfdf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.909976 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(ea2d5fbf0d6f8390e6b40ad491cf87631bc1b7460778550b4db4a9d6fdfdfdf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.910002 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(ea2d5fbf0d6f8390e6b40ad491cf87631bc1b7460778550b4db4a9d6fdfdfdf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.910071 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators(d2451757-b101-4568-87a5-37a165b4a460)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators(d2451757-b101-4568-87a5-37a165b4a460)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_openshift-operators_d2451757-b101-4568-87a5-37a165b4a460_0(ea2d5fbf0d6f8390e6b40ad491cf87631bc1b7460778550b4db4a9d6fdfdfdf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" podUID="d2451757-b101-4568-87a5-37a165b4a460" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.913905 4773 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(e5fc26f3da85eacb5ef9982a4ed823155db220e8ad6545a788e608bbb1ff4fbe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.913968 4773 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(e5fc26f3da85eacb5ef9982a4ed823155db220e8ad6545a788e608bbb1ff4fbe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.913996 4773 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(e5fc26f3da85eacb5ef9982a4ed823155db220e8ad6545a788e608bbb1ff4fbe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:28 crc kubenswrapper[4773]: E0121 15:35:28.914048 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators(0d10d91a-c775-4251-bd46-6034add658e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators(0d10d91a-c775-4251-bd46-6034add658e3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_openshift-operators_0d10d91a-c775-4251-bd46-6034add658e3_0(e5fc26f3da85eacb5ef9982a4ed823155db220e8ad6545a788e608bbb1ff4fbe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" podUID="0d10d91a-c775-4251-bd46-6034add658e3" Jan 21 15:35:38 crc kubenswrapper[4773]: I0121 15:35:38.460954 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 15:35:39 crc kubenswrapper[4773]: I0121 15:35:39.383618 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:39 crc kubenswrapper[4773]: I0121 15:35:39.384494 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:39 crc kubenswrapper[4773]: I0121 15:35:39.685104 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-649gj"] Jan 21 15:35:39 crc kubenswrapper[4773]: W0121 15:35:39.693260 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831aa084_8756_4fdc_bc57_38400d4a5650.slice/crio-ae53027ebba341705749ea0fb0c47b27eda84a46f1f1c2f804b3ccb6e168bb75 WatchSource:0}: Error finding container ae53027ebba341705749ea0fb0c47b27eda84a46f1f1c2f804b3ccb6e168bb75: Status 404 returned error can't find the container with id ae53027ebba341705749ea0fb0c47b27eda84a46f1f1c2f804b3ccb6e168bb75 Jan 21 15:35:39 crc kubenswrapper[4773]: I0121 15:35:39.901554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-649gj" event={"ID":"831aa084-8756-4fdc-bc57-38400d4a5650","Type":"ContainerStarted","Data":"ae53027ebba341705749ea0fb0c47b27eda84a46f1f1c2f804b3ccb6e168bb75"} Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.382957 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.383819 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.384220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.384431 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.384632 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.384851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.747240 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26"] Jan 21 15:35:42 crc kubenswrapper[4773]: W0121 15:35:42.764713 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d10d91a_c775_4251_bd46_6034add658e3.slice/crio-e1fa5734569578bc5fb12db8948e1eb50c7d4ec22ce9d49dc641565c70c0658f WatchSource:0}: Error finding container e1fa5734569578bc5fb12db8948e1eb50c7d4ec22ce9d49dc641565c70c0658f: Status 404 returned error can't find the container with id e1fa5734569578bc5fb12db8948e1eb50c7d4ec22ce9d49dc641565c70c0658f Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.800247 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz"] Jan 21 15:35:42 crc kubenswrapper[4773]: W0121 15:35:42.814541 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe682b1_6f91_4f6c_a43e_8b2520806e28.slice/crio-3dde47ae222c61ec9736d9737fc2c247f03c69afb88b18f10f674523d50625cf WatchSource:0}: Error finding container 3dde47ae222c61ec9736d9737fc2c247f03c69afb88b18f10f674523d50625cf: Status 404 returned error can't find the container with id 3dde47ae222c61ec9736d9737fc2c247f03c69afb88b18f10f674523d50625cf Jan 21 
15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.844658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj"] Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.919848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" event={"ID":"0d10d91a-c775-4251-bd46-6034add658e3","Type":"ContainerStarted","Data":"e1fa5734569578bc5fb12db8948e1eb50c7d4ec22ce9d49dc641565c70c0658f"} Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.921924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" event={"ID":"dbe682b1-6f91-4f6c-a43e-8b2520806e28","Type":"ContainerStarted","Data":"3dde47ae222c61ec9736d9737fc2c247f03c69afb88b18f10f674523d50625cf"} Jan 21 15:35:42 crc kubenswrapper[4773]: I0121 15:35:42.923389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" event={"ID":"d2451757-b101-4568-87a5-37a165b4a460","Type":"ContainerStarted","Data":"90f7e0b0ad70c8682ff3967051f05103b9d2e16f6a3ef305e25921708af69fdc"} Jan 21 15:35:44 crc kubenswrapper[4773]: I0121 15:35:44.382943 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:44 crc kubenswrapper[4773]: I0121 15:35:44.383464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:47 crc kubenswrapper[4773]: I0121 15:35:47.637355 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lnw8m"] Jan 21 15:35:47 crc kubenswrapper[4773]: I0121 15:35:47.960546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" event={"ID":"183b75e6-83ae-40f2-9c03-b2ff4e8959d2","Type":"ContainerStarted","Data":"437ee0196c8ce5e5dced9bb05b05f63c6f6458ef3f7eaf1a092a731dc2475538"} Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.972593 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" event={"ID":"0d10d91a-c775-4251-bd46-6034add658e3","Type":"ContainerStarted","Data":"216fc603ca118b83051b4a3869220f468413eaf04635c206e69b276c5c47795c"} Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.976857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-649gj" event={"ID":"831aa084-8756-4fdc-bc57-38400d4a5650","Type":"ContainerStarted","Data":"1312f65c887e795f4701fda7e0ecd4aab21ec8657318912ee2208ba9ab68ded5"} Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.977057 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.978601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" event={"ID":"dbe682b1-6f91-4f6c-a43e-8b2520806e28","Type":"ContainerStarted","Data":"795a4a1e8a2144660d56238dde9dc4f8accd3c632356d60058fb8b7abcb8280a"} Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.980426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" event={"ID":"d2451757-b101-4568-87a5-37a165b4a460","Type":"ContainerStarted","Data":"ce794a36b0e15e3c92596a263dd19221f669c55711d0fe003ad36fce23b7dcf4"} Jan 21 15:35:49 crc kubenswrapper[4773]: I0121 15:35:49.993381 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-lmk26" podStartSLOduration=16.749514099 podStartE2EDuration="22.993362248s" podCreationTimestamp="2026-01-21 15:35:27 +0000 UTC" firstStartedPulling="2026-01-21 15:35:42.768914806 +0000 UTC m=+707.693404428" lastFinishedPulling="2026-01-21 15:35:49.012762955 +0000 UTC m=+713.937252577" observedRunningTime="2026-01-21 15:35:49.986144345 +0000 UTC m=+714.910633987" watchObservedRunningTime="2026-01-21 15:35:49.993362248 +0000 UTC m=+714.917851870" Jan 21 15:35:50 crc kubenswrapper[4773]: I0121 15:35:50.005428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-649gj" Jan 21 15:35:50 crc kubenswrapper[4773]: I0121 15:35:50.017142 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj" podStartSLOduration=16.857125574 podStartE2EDuration="23.017117908s" podCreationTimestamp="2026-01-21 15:35:27 +0000 UTC" firstStartedPulling="2026-01-21 15:35:42.85082743 +0000 UTC m=+707.775317052" lastFinishedPulling="2026-01-21 15:35:49.010819764 +0000 UTC m=+713.935309386" observedRunningTime="2026-01-21 15:35:50.016223014 +0000 UTC m=+714.940712636" watchObservedRunningTime="2026-01-21 15:35:50.017117908 +0000 UTC m=+714.941607530" Jan 21 15:35:50 crc kubenswrapper[4773]: I0121 15:35:50.058877 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lsjgz" 
podStartSLOduration=16.872559913 podStartE2EDuration="23.058859715s" podCreationTimestamp="2026-01-21 15:35:27 +0000 UTC" firstStartedPulling="2026-01-21 15:35:42.823365171 +0000 UTC m=+707.747854793" lastFinishedPulling="2026-01-21 15:35:49.009664973 +0000 UTC m=+713.934154595" observedRunningTime="2026-01-21 15:35:50.058391933 +0000 UTC m=+714.982881555" watchObservedRunningTime="2026-01-21 15:35:50.058859715 +0000 UTC m=+714.983349337" Jan 21 15:35:50 crc kubenswrapper[4773]: I0121 15:35:50.060555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-649gj" podStartSLOduration=13.779850634 podStartE2EDuration="23.06054872s" podCreationTimestamp="2026-01-21 15:35:27 +0000 UTC" firstStartedPulling="2026-01-21 15:35:39.695512439 +0000 UTC m=+704.620002061" lastFinishedPulling="2026-01-21 15:35:48.976210525 +0000 UTC m=+713.900700147" observedRunningTime="2026-01-21 15:35:50.042307556 +0000 UTC m=+714.966797208" watchObservedRunningTime="2026-01-21 15:35:50.06054872 +0000 UTC m=+714.985038332" Jan 21 15:35:51 crc kubenswrapper[4773]: I0121 15:35:51.342533 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rm4rp" Jan 21 15:35:51 crc kubenswrapper[4773]: I0121 15:35:51.992297 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" event={"ID":"183b75e6-83ae-40f2-9c03-b2ff4e8959d2","Type":"ContainerStarted","Data":"6e795e41198b4e812ef76770d3ebb0a82514989049247234270284300b552f16"} Jan 21 15:35:52 crc kubenswrapper[4773]: I0121 15:35:52.013983 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" podStartSLOduration=21.610674817 podStartE2EDuration="25.013960898s" podCreationTimestamp="2026-01-21 15:35:27 +0000 UTC" firstStartedPulling="2026-01-21 15:35:47.913605178 +0000 UTC m=+712.838094800" 
lastFinishedPulling="2026-01-21 15:35:51.316891259 +0000 UTC m=+716.241380881" observedRunningTime="2026-01-21 15:35:52.013328131 +0000 UTC m=+716.937817753" watchObservedRunningTime="2026-01-21 15:35:52.013960898 +0000 UTC m=+716.938450520" Jan 21 15:35:52 crc kubenswrapper[4773]: I0121 15:35:52.996896 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:55 crc kubenswrapper[4773]: I0121 15:35:55.206136 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:35:55 crc kubenswrapper[4773]: I0121 15:35:55.206650 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:35:57 crc kubenswrapper[4773]: I0121 15:35:57.902747 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-lnw8m" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.883709 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6"] Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.884651 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.888276 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nckrx" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.888498 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.888644 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.904349 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6"] Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.912563 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-nwxrc"] Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.913555 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nwxrc" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.916267 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pzl6p" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.927204 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lsnb5"] Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.927905 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.929621 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-c8r7q" Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.930417 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nwxrc"] Jan 21 15:35:58 crc kubenswrapper[4773]: I0121 15:35:58.946160 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lsnb5"] Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.003630 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxfb\" (UniqueName: \"kubernetes.io/projected/d928eb9a-b6dc-4248-9844-54eab0a907fa-kube-api-access-ttxfb\") pod \"cert-manager-858654f9db-nwxrc\" (UID: \"d928eb9a-b6dc-4248-9844-54eab0a907fa\") " pod="cert-manager/cert-manager-858654f9db-nwxrc" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.003813 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622m7\" (UniqueName: \"kubernetes.io/projected/f7716ced-8d86-4afa-847b-10feff07e324-kube-api-access-622m7\") pod \"cert-manager-cainjector-cf98fcc89-ftfc6\" (UID: \"f7716ced-8d86-4afa-847b-10feff07e324\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.104949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-622m7\" (UniqueName: \"kubernetes.io/projected/f7716ced-8d86-4afa-847b-10feff07e324-kube-api-access-622m7\") pod \"cert-manager-cainjector-cf98fcc89-ftfc6\" (UID: \"f7716ced-8d86-4afa-847b-10feff07e324\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.105505 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfb8r\" (UniqueName: \"kubernetes.io/projected/6f1b2b84-a0ef-43f5-987e-3960271487b8-kube-api-access-zfb8r\") pod \"cert-manager-webhook-687f57d79b-lsnb5\" (UID: \"6f1b2b84-a0ef-43f5-987e-3960271487b8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.105601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxfb\" (UniqueName: \"kubernetes.io/projected/d928eb9a-b6dc-4248-9844-54eab0a907fa-kube-api-access-ttxfb\") pod \"cert-manager-858654f9db-nwxrc\" (UID: \"d928eb9a-b6dc-4248-9844-54eab0a907fa\") " pod="cert-manager/cert-manager-858654f9db-nwxrc" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.122367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxfb\" (UniqueName: \"kubernetes.io/projected/d928eb9a-b6dc-4248-9844-54eab0a907fa-kube-api-access-ttxfb\") pod \"cert-manager-858654f9db-nwxrc\" (UID: \"d928eb9a-b6dc-4248-9844-54eab0a907fa\") " pod="cert-manager/cert-manager-858654f9db-nwxrc" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.122402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622m7\" (UniqueName: \"kubernetes.io/projected/f7716ced-8d86-4afa-847b-10feff07e324-kube-api-access-622m7\") pod \"cert-manager-cainjector-cf98fcc89-ftfc6\" (UID: \"f7716ced-8d86-4afa-847b-10feff07e324\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.203210 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.206572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfb8r\" (UniqueName: \"kubernetes.io/projected/6f1b2b84-a0ef-43f5-987e-3960271487b8-kube-api-access-zfb8r\") pod \"cert-manager-webhook-687f57d79b-lsnb5\" (UID: \"6f1b2b84-a0ef-43f5-987e-3960271487b8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.223656 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfb8r\" (UniqueName: \"kubernetes.io/projected/6f1b2b84-a0ef-43f5-987e-3960271487b8-kube-api-access-zfb8r\") pod \"cert-manager-webhook-687f57d79b-lsnb5\" (UID: \"6f1b2b84-a0ef-43f5-987e-3960271487b8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.227149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nwxrc" Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.240606 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:35:59 crc kubenswrapper[4773]: W0121 15:35:59.509993 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd928eb9a_b6dc_4248_9844_54eab0a907fa.slice/crio-43bb84ca8362b33d90f398d0ca44af7fe60287e37f4c1fb72dc10a86ea782626 WatchSource:0}: Error finding container 43bb84ca8362b33d90f398d0ca44af7fe60287e37f4c1fb72dc10a86ea782626: Status 404 returned error can't find the container with id 43bb84ca8362b33d90f398d0ca44af7fe60287e37f4c1fb72dc10a86ea782626 Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.519098 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nwxrc"] Jan 21 15:35:59 crc kubenswrapper[4773]: W0121 15:35:59.664935 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7716ced_8d86_4afa_847b_10feff07e324.slice/crio-0a285a709eeff4bc8222b48c5dc9740021ad736911ed254f3ccd5f6de5867628 WatchSource:0}: Error finding container 0a285a709eeff4bc8222b48c5dc9740021ad736911ed254f3ccd5f6de5867628: Status 404 returned error can't find the container with id 0a285a709eeff4bc8222b48c5dc9740021ad736911ed254f3ccd5f6de5867628 Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.667526 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6"] Jan 21 15:35:59 crc kubenswrapper[4773]: I0121 15:35:59.671952 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-lsnb5"] Jan 21 15:36:00 crc kubenswrapper[4773]: I0121 15:36:00.035783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" 
event={"ID":"6f1b2b84-a0ef-43f5-987e-3960271487b8","Type":"ContainerStarted","Data":"ed92e60ed8248a4173b118c8e64a318c2f64850b8646ec165ac3077553b0e5d7"} Jan 21 15:36:00 crc kubenswrapper[4773]: I0121 15:36:00.037118 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nwxrc" event={"ID":"d928eb9a-b6dc-4248-9844-54eab0a907fa","Type":"ContainerStarted","Data":"43bb84ca8362b33d90f398d0ca44af7fe60287e37f4c1fb72dc10a86ea782626"} Jan 21 15:36:00 crc kubenswrapper[4773]: I0121 15:36:00.038156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" event={"ID":"f7716ced-8d86-4afa-847b-10feff07e324","Type":"ContainerStarted","Data":"0a285a709eeff4bc8222b48c5dc9740021ad736911ed254f3ccd5f6de5867628"} Jan 21 15:36:05 crc kubenswrapper[4773]: I0121 15:36:05.069151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nwxrc" event={"ID":"d928eb9a-b6dc-4248-9844-54eab0a907fa","Type":"ContainerStarted","Data":"1860a9ccbfe50c7b5ce29a3d089768a56f898bb2ab7e8cfe718e2f2cc6d34a17"} Jan 21 15:36:05 crc kubenswrapper[4773]: I0121 15:36:05.070417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" event={"ID":"f7716ced-8d86-4afa-847b-10feff07e324","Type":"ContainerStarted","Data":"7710357d9b20ffb39ff3090a7125745978828af705f207560d8c77dd1c369705"} Jan 21 15:36:05 crc kubenswrapper[4773]: I0121 15:36:05.083204 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-nwxrc" podStartSLOduration=2.5104254470000003 podStartE2EDuration="7.083183492s" podCreationTimestamp="2026-01-21 15:35:58 +0000 UTC" firstStartedPulling="2026-01-21 15:35:59.513753477 +0000 UTC m=+724.438243099" lastFinishedPulling="2026-01-21 15:36:04.086511522 +0000 UTC m=+729.011001144" observedRunningTime="2026-01-21 15:36:05.082515853 +0000 UTC m=+730.007005475" 
watchObservedRunningTime="2026-01-21 15:36:05.083183492 +0000 UTC m=+730.007673124" Jan 21 15:36:05 crc kubenswrapper[4773]: I0121 15:36:05.149305 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ftfc6" podStartSLOduration=2.7287615240000003 podStartE2EDuration="7.149267529s" podCreationTimestamp="2026-01-21 15:35:58 +0000 UTC" firstStartedPulling="2026-01-21 15:35:59.666884871 +0000 UTC m=+724.591374493" lastFinishedPulling="2026-01-21 15:36:04.087390876 +0000 UTC m=+729.011880498" observedRunningTime="2026-01-21 15:36:05.096613747 +0000 UTC m=+730.021103369" watchObservedRunningTime="2026-01-21 15:36:05.149267529 +0000 UTC m=+730.073757161" Jan 21 15:36:06 crc kubenswrapper[4773]: I0121 15:36:06.076878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" event={"ID":"6f1b2b84-a0ef-43f5-987e-3960271487b8","Type":"ContainerStarted","Data":"ef7afa12d046c18d827aef4f663f937ef7ab6290d12cdbd22e60f4b1e4aa596b"} Jan 21 15:36:06 crc kubenswrapper[4773]: I0121 15:36:06.093579 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" podStartSLOduration=2.254988701 podStartE2EDuration="8.093556984s" podCreationTimestamp="2026-01-21 15:35:58 +0000 UTC" firstStartedPulling="2026-01-21 15:35:59.679986447 +0000 UTC m=+724.604476069" lastFinishedPulling="2026-01-21 15:36:05.51855473 +0000 UTC m=+730.443044352" observedRunningTime="2026-01-21 15:36:06.089916805 +0000 UTC m=+731.014406437" watchObservedRunningTime="2026-01-21 15:36:06.093556984 +0000 UTC m=+731.018046616" Jan 21 15:36:07 crc kubenswrapper[4773]: I0121 15:36:07.081173 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:36:14 crc kubenswrapper[4773]: I0121 15:36:14.243216 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" Jan 21 15:36:25 crc kubenswrapper[4773]: I0121 15:36:25.205985 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:36:25 crc kubenswrapper[4773]: I0121 15:36:25.206540 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.502062 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw"] Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.503651 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.505978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.516236 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw"] Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.571785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.571853 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.571880 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knn6d\" (UniqueName: \"kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: 
I0121 15:36:47.672748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.672812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knn6d\" (UniqueName: \"kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.672897 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.673321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.673381 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.690935 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knn6d\" (UniqueName: \"kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:47 crc kubenswrapper[4773]: I0121 15:36:47.818505 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:48 crc kubenswrapper[4773]: I0121 15:36:48.138670 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw"] Jan 21 15:36:48 crc kubenswrapper[4773]: I0121 15:36:48.333941 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" event={"ID":"3c3dbffb-bd58-407b-8b8f-b968770973dd","Type":"ContainerStarted","Data":"b2f1ee5a0b6e6bcf1cc1494e678782b31e834b916a3f14bbd624b653d6c20ee9"} Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.821307 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.822890 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.831478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.901729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.901792 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:49 crc kubenswrapper[4773]: I0121 15:36:49.901841 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnl29\" (UniqueName: \"kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.003234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnl29\" (UniqueName: \"kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.003329 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.003362 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.003778 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.003810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.022285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnl29\" (UniqueName: \"kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29\") pod \"redhat-operators-zsplh\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.137931 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.173788 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.174650 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.177396 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.177440 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.178329 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.307181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.307586 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2bf\" (UniqueName: \"kubernetes.io/projected/cd00b3d3-ce72-49dc-ac53-92122035b1d2-kube-api-access-nd2bf\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.346975 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.349920 4773 generic.go:334] "Generic (PLEG): container finished" podID="3c3dbffb-bd58-407b-8b8f-b968770973dd" 
containerID="3e7202c50ff149f771cdff0692bb7c1aee66d8622ce42596c574ccbefa7abe23" exitCode=0 Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.349974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" event={"ID":"3c3dbffb-bd58-407b-8b8f-b968770973dd","Type":"ContainerDied","Data":"3e7202c50ff149f771cdff0692bb7c1aee66d8622ce42596c574ccbefa7abe23"} Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.410450 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.410502 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2bf\" (UniqueName: \"kubernetes.io/projected/cd00b3d3-ce72-49dc-ac53-92122035b1d2-kube-api-access-nd2bf\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.414868 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.414921 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53c81a8d8a25f0a498d4c38bc6e23846aa8e23e254abf48cbcfdb005ae0cbbd3/globalmount\"" pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.435365 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2bf\" (UniqueName: \"kubernetes.io/projected/cd00b3d3-ce72-49dc-ac53-92122035b1d2-kube-api-access-nd2bf\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.453677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f41b2b3-660e-4410-95c3-ff61d58f63b1\") pod \"minio\" (UID: \"cd00b3d3-ce72-49dc-ac53-92122035b1d2\") " pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.508057 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Jan 21 15:36:50 crc kubenswrapper[4773]: I0121 15:36:50.708494 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 21 15:36:51 crc kubenswrapper[4773]: I0121 15:36:51.356597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"cd00b3d3-ce72-49dc-ac53-92122035b1d2","Type":"ContainerStarted","Data":"8bb04d2f5878a03b54c8118efc913251901eefcaf102e1b8f640e60c01f9c0a9"} Jan 21 15:36:51 crc kubenswrapper[4773]: I0121 15:36:51.357896 4773 generic.go:334] "Generic (PLEG): container finished" podID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerID="b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd" exitCode=0 Jan 21 15:36:51 crc kubenswrapper[4773]: I0121 15:36:51.357937 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerDied","Data":"b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd"} Jan 21 15:36:51 crc kubenswrapper[4773]: I0121 15:36:51.357959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerStarted","Data":"490760063d4e372f7be0fa89c49b1c59472abc866bdbd2cbc4399e13dcbf0c01"} Jan 21 15:36:55 crc kubenswrapper[4773]: I0121 15:36:55.205923 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:36:55 crc kubenswrapper[4773]: I0121 15:36:55.206295 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:36:55 crc kubenswrapper[4773]: I0121 15:36:55.206359 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:36:55 crc kubenswrapper[4773]: I0121 15:36:55.207166 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:36:55 crc kubenswrapper[4773]: I0121 15:36:55.207237 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328" gracePeriod=600 Jan 21 15:36:56 crc kubenswrapper[4773]: I0121 15:36:56.386364 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328" exitCode=0 Jan 21 15:36:56 crc kubenswrapper[4773]: I0121 15:36:56.386418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328"} Jan 21 15:36:56 crc kubenswrapper[4773]: I0121 15:36:56.386782 4773 scope.go:117] "RemoveContainer" containerID="29b829ffd2aebda2804aa1f7f5361a11f62cec99fd30907cee90b79e5bde91e6" Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.396048 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140"} Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.398338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"cd00b3d3-ce72-49dc-ac53-92122035b1d2","Type":"ContainerStarted","Data":"402f4ffb974476e042b5af55a284d7e554ac7a36cb61f0df9574cef5b1759066"} Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.400553 4773 generic.go:334] "Generic (PLEG): container finished" podID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerID="481ca67f5c29d5cf3942da72f503ba4546d8b5d865de38ecae99ed151f6f19d4" exitCode=0 Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.400665 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" event={"ID":"3c3dbffb-bd58-407b-8b8f-b968770973dd","Type":"ContainerDied","Data":"481ca67f5c29d5cf3942da72f503ba4546d8b5d865de38ecae99ed151f6f19d4"} Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.404295 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerStarted","Data":"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8"} Jan 21 15:36:57 crc kubenswrapper[4773]: I0121 15:36:57.475627 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.043487965 podStartE2EDuration="10.475602531s" podCreationTimestamp="2026-01-21 15:36:47 +0000 UTC" firstStartedPulling="2026-01-21 15:36:50.716221972 +0000 UTC m=+775.640711594" lastFinishedPulling="2026-01-21 15:36:57.148336538 +0000 UTC m=+782.072826160" observedRunningTime="2026-01-21 15:36:57.474632444 +0000 UTC 
m=+782.399122066" watchObservedRunningTime="2026-01-21 15:36:57.475602531 +0000 UTC m=+782.400092143" Jan 21 15:36:58 crc kubenswrapper[4773]: I0121 15:36:58.411654 4773 generic.go:334] "Generic (PLEG): container finished" podID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerID="5b345a576be950be5667b30239e06082328115198f907a4f2619d5ee263d84c9" exitCode=0 Jan 21 15:36:58 crc kubenswrapper[4773]: I0121 15:36:58.412325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" event={"ID":"3c3dbffb-bd58-407b-8b8f-b968770973dd","Type":"ContainerDied","Data":"5b345a576be950be5667b30239e06082328115198f907a4f2619d5ee263d84c9"} Jan 21 15:36:58 crc kubenswrapper[4773]: I0121 15:36:58.414411 4773 generic.go:334] "Generic (PLEG): container finished" podID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerID="7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8" exitCode=0 Jan 21 15:36:58 crc kubenswrapper[4773]: I0121 15:36:58.415383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerDied","Data":"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8"} Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.429153 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerStarted","Data":"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6"} Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.461060 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zsplh" podStartSLOduration=2.999802915 podStartE2EDuration="10.461042006s" podCreationTimestamp="2026-01-21 15:36:49 +0000 UTC" firstStartedPulling="2026-01-21 15:36:51.359393519 +0000 UTC m=+776.283883141" 
lastFinishedPulling="2026-01-21 15:36:58.82063261 +0000 UTC m=+783.745122232" observedRunningTime="2026-01-21 15:36:59.459275618 +0000 UTC m=+784.383765260" watchObservedRunningTime="2026-01-21 15:36:59.461042006 +0000 UTC m=+784.385531638" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.772657 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.885715 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util\") pod \"3c3dbffb-bd58-407b-8b8f-b968770973dd\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.886063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knn6d\" (UniqueName: \"kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d\") pod \"3c3dbffb-bd58-407b-8b8f-b968770973dd\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.886958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle\") pod \"3c3dbffb-bd58-407b-8b8f-b968770973dd\" (UID: \"3c3dbffb-bd58-407b-8b8f-b968770973dd\") " Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.887990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle" (OuterVolumeSpecName: "bundle") pod "3c3dbffb-bd58-407b-8b8f-b968770973dd" (UID: "3c3dbffb-bd58-407b-8b8f-b968770973dd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.893101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d" (OuterVolumeSpecName: "kube-api-access-knn6d") pod "3c3dbffb-bd58-407b-8b8f-b968770973dd" (UID: "3c3dbffb-bd58-407b-8b8f-b968770973dd"). InnerVolumeSpecName "kube-api-access-knn6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.896408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util" (OuterVolumeSpecName: "util") pod "3c3dbffb-bd58-407b-8b8f-b968770973dd" (UID: "3c3dbffb-bd58-407b-8b8f-b968770973dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.988721 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.988985 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knn6d\" (UniqueName: \"kubernetes.io/projected/3c3dbffb-bd58-407b-8b8f-b968770973dd-kube-api-access-knn6d\") on node \"crc\" DevicePath \"\"" Jan 21 15:36:59 crc kubenswrapper[4773]: I0121 15:36:59.989051 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3dbffb-bd58-407b-8b8f-b968770973dd-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:00 crc kubenswrapper[4773]: I0121 15:37:00.138679 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:00 crc kubenswrapper[4773]: I0121 15:37:00.138764 4773 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:00 crc kubenswrapper[4773]: I0121 15:37:00.436903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" event={"ID":"3c3dbffb-bd58-407b-8b8f-b968770973dd","Type":"ContainerDied","Data":"b2f1ee5a0b6e6bcf1cc1494e678782b31e834b916a3f14bbd624b653d6c20ee9"} Jan 21 15:37:00 crc kubenswrapper[4773]: I0121 15:37:00.436974 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f1ee5a0b6e6bcf1cc1494e678782b31e834b916a3f14bbd624b653d6c20ee9" Jan 21 15:37:00 crc kubenswrapper[4773]: I0121 15:37:00.436980 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw" Jan 21 15:37:01 crc kubenswrapper[4773]: I0121 15:37:01.180605 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zsplh" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="registry-server" probeResult="failure" output=< Jan 21 15:37:01 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 15:37:01 crc kubenswrapper[4773]: > Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.767399 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj"] Jan 21 15:37:06 crc kubenswrapper[4773]: E0121 15:37:06.769126 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="util" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.769189 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="util" Jan 21 15:37:06 crc kubenswrapper[4773]: E0121 15:37:06.769256 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="pull" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.769302 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="pull" Jan 21 15:37:06 crc kubenswrapper[4773]: E0121 15:37:06.769356 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="extract" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.769399 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="extract" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.769528 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3dbffb-bd58-407b-8b8f-b968770973dd" containerName="extract" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.770176 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.773102 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.773450 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.773654 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-wdmv5" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.773834 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.773978 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.776184 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.789535 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj"] Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.873913 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-apiservice-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.873975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-webhook-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.874006 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbkkk\" (UniqueName: \"kubernetes.io/projected/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-kube-api-access-nbkkk\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.874043 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-manager-config\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.874061 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.974858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-apiservice-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.974926 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-webhook-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.974955 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbkkk\" (UniqueName: 
\"kubernetes.io/projected/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-kube-api-access-nbkkk\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.974991 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-manager-config\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.975010 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.977029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-manager-config\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.981290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-apiservice-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.988556 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-webhook-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:06 crc kubenswrapper[4773]: I0121 15:37:06.989015 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:07 crc kubenswrapper[4773]: I0121 15:37:07.008391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbkkk\" (UniqueName: \"kubernetes.io/projected/e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20-kube-api-access-nbkkk\") pod \"loki-operator-controller-manager-55d66b9568-sfgsj\" (UID: \"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:07 crc kubenswrapper[4773]: I0121 15:37:07.087253 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:07 crc kubenswrapper[4773]: I0121 15:37:07.517064 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj"] Jan 21 15:37:07 crc kubenswrapper[4773]: W0121 15:37:07.522740 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode47fdd5c_ea4d_4bf2_a46b_f1a2d7cbfd20.slice/crio-29f022bd33b8c2488549959b7b8dc42b286ffdd235896a00e28db57c63f51af9 WatchSource:0}: Error finding container 29f022bd33b8c2488549959b7b8dc42b286ffdd235896a00e28db57c63f51af9: Status 404 returned error can't find the container with id 29f022bd33b8c2488549959b7b8dc42b286ffdd235896a00e28db57c63f51af9 Jan 21 15:37:08 crc kubenswrapper[4773]: I0121 15:37:08.481826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" event={"ID":"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20","Type":"ContainerStarted","Data":"29f022bd33b8c2488549959b7b8dc42b286ffdd235896a00e28db57c63f51af9"} Jan 21 15:37:10 crc kubenswrapper[4773]: I0121 15:37:10.173368 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:10 crc kubenswrapper[4773]: I0121 15:37:10.208017 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:10 crc kubenswrapper[4773]: I0121 15:37:10.929265 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:37:11 crc kubenswrapper[4773]: I0121 15:37:11.496129 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zsplh" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" 
containerName="registry-server" containerID="cri-o://20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6" gracePeriod=2 Jan 21 15:37:11 crc kubenswrapper[4773]: I0121 15:37:11.861682 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.039438 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities\") pod \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.039576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content\") pod \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.039631 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnl29\" (UniqueName: \"kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29\") pod \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\" (UID: \"4da26ac5-96ca-40e8-8990-10bbc5f5b35f\") " Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.040240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities" (OuterVolumeSpecName: "utilities") pod "4da26ac5-96ca-40e8-8990-10bbc5f5b35f" (UID: "4da26ac5-96ca-40e8-8990-10bbc5f5b35f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.045274 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29" (OuterVolumeSpecName: "kube-api-access-vnl29") pod "4da26ac5-96ca-40e8-8990-10bbc5f5b35f" (UID: "4da26ac5-96ca-40e8-8990-10bbc5f5b35f"). InnerVolumeSpecName "kube-api-access-vnl29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.141573 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnl29\" (UniqueName: \"kubernetes.io/projected/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-kube-api-access-vnl29\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.141611 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.173647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4da26ac5-96ca-40e8-8990-10bbc5f5b35f" (UID: "4da26ac5-96ca-40e8-8990-10bbc5f5b35f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.243312 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da26ac5-96ca-40e8-8990-10bbc5f5b35f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.505491 4773 generic.go:334] "Generic (PLEG): container finished" podID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerID="20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6" exitCode=0 Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.505536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerDied","Data":"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6"} Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.505565 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsplh" event={"ID":"4da26ac5-96ca-40e8-8990-10bbc5f5b35f","Type":"ContainerDied","Data":"490760063d4e372f7be0fa89c49b1c59472abc866bdbd2cbc4399e13dcbf0c01"} Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.505564 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zsplh" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.505582 4773 scope.go:117] "RemoveContainer" containerID="20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.523921 4773 scope.go:117] "RemoveContainer" containerID="7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.548632 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.554097 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zsplh"] Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.566182 4773 scope.go:117] "RemoveContainer" containerID="b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.582060 4773 scope.go:117] "RemoveContainer" containerID="20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6" Jan 21 15:37:12 crc kubenswrapper[4773]: E0121 15:37:12.582550 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6\": container with ID starting with 20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6 not found: ID does not exist" containerID="20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.582582 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6"} err="failed to get container status \"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6\": rpc error: code = NotFound desc = could not find container 
\"20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6\": container with ID starting with 20817d71153e6d882197a3f02aa4d4f7ad8ffccc447c6004853669d6317e1fd6 not found: ID does not exist" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.582604 4773 scope.go:117] "RemoveContainer" containerID="7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8" Jan 21 15:37:12 crc kubenswrapper[4773]: E0121 15:37:12.583088 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8\": container with ID starting with 7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8 not found: ID does not exist" containerID="7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.583111 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8"} err="failed to get container status \"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8\": rpc error: code = NotFound desc = could not find container \"7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8\": container with ID starting with 7010063b0e448f15fbcaf0557e44e576187ea6116c78a815a53b91e8b155aca8 not found: ID does not exist" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.583125 4773 scope.go:117] "RemoveContainer" containerID="b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd" Jan 21 15:37:12 crc kubenswrapper[4773]: E0121 15:37:12.583387 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd\": container with ID starting with b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd not found: ID does not exist" 
containerID="b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd" Jan 21 15:37:12 crc kubenswrapper[4773]: I0121 15:37:12.583429 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd"} err="failed to get container status \"b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd\": rpc error: code = NotFound desc = could not find container \"b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd\": container with ID starting with b5f2a9be50be98429247e7e2261f1c7e24de523cb4a99fe128e8b231509a66dd not found: ID does not exist" Jan 21 15:37:13 crc kubenswrapper[4773]: I0121 15:37:13.417776 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" path="/var/lib/kubelet/pods/4da26ac5-96ca-40e8-8990-10bbc5f5b35f/volumes" Jan 21 15:37:15 crc kubenswrapper[4773]: I0121 15:37:15.523242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" event={"ID":"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20","Type":"ContainerStarted","Data":"78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e"} Jan 21 15:37:21 crc kubenswrapper[4773]: I0121 15:37:21.565361 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" event={"ID":"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20","Type":"ContainerStarted","Data":"ba70c2f9d42777c01ce79e2b992b5e0e8386697e7c8c98956e6019ffdd091913"} Jan 21 15:37:21 crc kubenswrapper[4773]: I0121 15:37:21.565865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:21 crc kubenswrapper[4773]: I0121 15:37:21.569198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 15:37:21 crc kubenswrapper[4773]: I0121 15:37:21.600612 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" podStartSLOduration=2.746421049 podStartE2EDuration="15.600577601s" podCreationTimestamp="2026-01-21 15:37:06 +0000 UTC" firstStartedPulling="2026-01-21 15:37:07.524719392 +0000 UTC m=+792.449209004" lastFinishedPulling="2026-01-21 15:37:20.378875934 +0000 UTC m=+805.303365556" observedRunningTime="2026-01-21 15:37:21.592848548 +0000 UTC m=+806.517338200" watchObservedRunningTime="2026-01-21 15:37:21.600577601 +0000 UTC m=+806.525067263" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.040482 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"] Jan 21 15:37:46 crc kubenswrapper[4773]: E0121 15:37:46.041257 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="registry-server" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.041272 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="registry-server" Jan 21 15:37:46 crc kubenswrapper[4773]: E0121 15:37:46.041286 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="extract-utilities" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.041294 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="extract-utilities" Jan 21 15:37:46 crc kubenswrapper[4773]: E0121 15:37:46.041315 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="extract-content" Jan 21 15:37:46 crc kubenswrapper[4773]: 
I0121 15:37:46.041324 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="extract-content" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.041439 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da26ac5-96ca-40e8-8990-10bbc5f5b35f" containerName="registry-server" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.042258 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.044354 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.060601 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"] Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.193345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.193409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.193440 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5pl\" (UniqueName: \"kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.294809 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.294866 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.294896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5pl\" (UniqueName: \"kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.295507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.295597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.317842 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5pl\" (UniqueName: \"kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.360582 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.600955 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"]
Jan 21 15:37:46 crc kubenswrapper[4773]: I0121 15:37:46.713610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" event={"ID":"26c1eaa2-3f40-480e-addf-1a97073c381c","Type":"ContainerStarted","Data":"509049724b246263beec154b0075ddd62cc22546b76f3f1b7a6e607a7fbbe964"}
Jan 21 15:37:48 crc kubenswrapper[4773]: I0121 15:37:48.725219 4773 generic.go:334] "Generic (PLEG): container finished" podID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerID="592499a39b69f54dd999fd6d18b70ef9e1aabe69cc6a5bf5f5f3f1f81dadc9c4" exitCode=0
Jan 21 15:37:48 crc kubenswrapper[4773]: I0121 15:37:48.725498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" event={"ID":"26c1eaa2-3f40-480e-addf-1a97073c381c","Type":"ContainerDied","Data":"592499a39b69f54dd999fd6d18b70ef9e1aabe69cc6a5bf5f5f3f1f81dadc9c4"}
Jan 21 15:37:53 crc kubenswrapper[4773]: I0121 15:37:53.766470 4773 generic.go:334] "Generic (PLEG): container finished" podID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerID="4c4124640c08184a877893c36552a81892116661a45f55f5d1313023bbbd88b0" exitCode=0
Jan 21 15:37:53 crc kubenswrapper[4773]: I0121 15:37:53.766601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" event={"ID":"26c1eaa2-3f40-480e-addf-1a97073c381c","Type":"ContainerDied","Data":"4c4124640c08184a877893c36552a81892116661a45f55f5d1313023bbbd88b0"}
Jan 21 15:37:54 crc kubenswrapper[4773]: I0121 15:37:54.774998 4773 generic.go:334] "Generic (PLEG): container finished" podID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerID="3901b84d3cbb5da1a88fd00f565447124cc58aaefece9f4e244cbb2063cfcc29" exitCode=0
Jan 21 15:37:54 crc kubenswrapper[4773]: I0121 15:37:54.775096 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" event={"ID":"26c1eaa2-3f40-480e-addf-1a97073c381c","Type":"ContainerDied","Data":"3901b84d3cbb5da1a88fd00f565447124cc58aaefece9f4e244cbb2063cfcc29"}
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.060395 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.228996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util\") pod \"26c1eaa2-3f40-480e-addf-1a97073c381c\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") "
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.229321 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5pl\" (UniqueName: \"kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl\") pod \"26c1eaa2-3f40-480e-addf-1a97073c381c\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") "
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.229403 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle\") pod \"26c1eaa2-3f40-480e-addf-1a97073c381c\" (UID: \"26c1eaa2-3f40-480e-addf-1a97073c381c\") "
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.230035 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle" (OuterVolumeSpecName: "bundle") pod "26c1eaa2-3f40-480e-addf-1a97073c381c" (UID: "26c1eaa2-3f40-480e-addf-1a97073c381c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.234804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl" (OuterVolumeSpecName: "kube-api-access-mx5pl") pod "26c1eaa2-3f40-480e-addf-1a97073c381c" (UID: "26c1eaa2-3f40-480e-addf-1a97073c381c"). InnerVolumeSpecName "kube-api-access-mx5pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.238909 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util" (OuterVolumeSpecName: "util") pod "26c1eaa2-3f40-480e-addf-1a97073c381c" (UID: "26c1eaa2-3f40-480e-addf-1a97073c381c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.330757 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-util\") on node \"crc\" DevicePath \"\""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.330790 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c1eaa2-3f40-480e-addf-1a97073c381c-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.330799 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5pl\" (UniqueName: \"kubernetes.io/projected/26c1eaa2-3f40-480e-addf-1a97073c381c-kube-api-access-mx5pl\") on node \"crc\" DevicePath \"\""
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.795502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm" event={"ID":"26c1eaa2-3f40-480e-addf-1a97073c381c","Type":"ContainerDied","Data":"509049724b246263beec154b0075ddd62cc22546b76f3f1b7a6e607a7fbbe964"}
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.795550 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="509049724b246263beec154b0075ddd62cc22546b76f3f1b7a6e607a7fbbe964"
Jan 21 15:37:56 crc kubenswrapper[4773]: I0121 15:37:56.795925 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.064774 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vqt8s"]
Jan 21 15:38:03 crc kubenswrapper[4773]: E0121 15:38:03.065489 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="extract"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.065506 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="extract"
Jan 21 15:38:03 crc kubenswrapper[4773]: E0121 15:38:03.065521 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="util"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.065528 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="util"
Jan 21 15:38:03 crc kubenswrapper[4773]: E0121 15:38:03.065541 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="pull"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.065547 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="pull"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.065649 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c1eaa2-3f40-480e-addf-1a97073c381c" containerName="extract"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.066135 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.067816 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-789h6"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.068151 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.069562 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.081310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vqt8s"]
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.129072 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpnj\" (UniqueName: \"kubernetes.io/projected/bd44c2ca-cc7f-432a-89b8-02a06428b3c9-kube-api-access-twpnj\") pod \"nmstate-operator-646758c888-vqt8s\" (UID: \"bd44c2ca-cc7f-432a-89b8-02a06428b3c9\") " pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.230263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpnj\" (UniqueName: \"kubernetes.io/projected/bd44c2ca-cc7f-432a-89b8-02a06428b3c9-kube-api-access-twpnj\") pod \"nmstate-operator-646758c888-vqt8s\" (UID: \"bd44c2ca-cc7f-432a-89b8-02a06428b3c9\") " pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.249632 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpnj\" (UniqueName: \"kubernetes.io/projected/bd44c2ca-cc7f-432a-89b8-02a06428b3c9-kube-api-access-twpnj\") pod \"nmstate-operator-646758c888-vqt8s\" (UID: \"bd44c2ca-cc7f-432a-89b8-02a06428b3c9\") " pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.383285 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s"
Jan 21 15:38:03 crc kubenswrapper[4773]: I0121 15:38:03.840379 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vqt8s"]
Jan 21 15:38:03 crc kubenswrapper[4773]: W0121 15:38:03.853884 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd44c2ca_cc7f_432a_89b8_02a06428b3c9.slice/crio-a8be24ef607cd5c9c29bfb17a5be51c3a9d1148671efd5fca469ae8c0d6a8b92 WatchSource:0}: Error finding container a8be24ef607cd5c9c29bfb17a5be51c3a9d1148671efd5fca469ae8c0d6a8b92: Status 404 returned error can't find the container with id a8be24ef607cd5c9c29bfb17a5be51c3a9d1148671efd5fca469ae8c0d6a8b92
Jan 21 15:38:04 crc kubenswrapper[4773]: I0121 15:38:04.842171 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s" event={"ID":"bd44c2ca-cc7f-432a-89b8-02a06428b3c9","Type":"ContainerStarted","Data":"a8be24ef607cd5c9c29bfb17a5be51c3a9d1148671efd5fca469ae8c0d6a8b92"}
Jan 21 15:38:09 crc kubenswrapper[4773]: I0121 15:38:09.869333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s" event={"ID":"bd44c2ca-cc7f-432a-89b8-02a06428b3c9","Type":"ContainerStarted","Data":"bbea22782bde6ec7d2a063bdfff9960c4faa7b4ae5bd134d52d4245b9e580e88"}
Jan 21 15:38:09 crc kubenswrapper[4773]: I0121 15:38:09.889457 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-vqt8s" podStartSLOduration=1.607002888 podStartE2EDuration="6.889434899s" podCreationTimestamp="2026-01-21 15:38:03 +0000 UTC" firstStartedPulling="2026-01-21 15:38:03.855683021 +0000 UTC m=+848.780172643" lastFinishedPulling="2026-01-21 15:38:09.138115042 +0000 UTC m=+854.062604654" observedRunningTime="2026-01-21 15:38:09.884600002 +0000 UTC m=+854.809089644" watchObservedRunningTime="2026-01-21 15:38:09.889434899 +0000 UTC m=+854.813924521"
Jan 21 15:38:10 crc kubenswrapper[4773]: I0121 15:38:10.959015 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-26trv"]
Jan 21 15:38:10 crc kubenswrapper[4773]: I0121 15:38:10.960460 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv"
Jan 21 15:38:10 crc kubenswrapper[4773]: I0121 15:38:10.964210 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zm4w6"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.013174 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.014180 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.017986 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.024032 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-26trv"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.034820 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.041132 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xv8b2"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.042108 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.123389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdq9b\" (UniqueName: \"kubernetes.io/projected/796aeb37-4f9a-401e-ad8d-5a9da9487e56-kube-api-access-qdq9b\") pod \"nmstate-metrics-54757c584b-26trv\" (UID: \"796aeb37-4f9a-401e-ad8d-5a9da9487e56\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.123471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7f48\" (UniqueName: \"kubernetes.io/projected/7587ffb6-642a-4676-a627-1c77024022b2-kube-api-access-k7f48\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.123520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.127661 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.128616 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.132579 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.132579 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.141224 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.141740 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fn66n"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7f48\" (UniqueName: \"kubernetes.io/projected/7587ffb6-642a-4676-a627-1c77024022b2-kube-api-access-k7f48\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-ovs-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwlz\" (UniqueName: \"kubernetes.io/projected/4e6397af-d97a-44cb-8e6d-babc6dab33c4-kube-api-access-drwlz\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225390 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-nmstate-lock\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225415 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-dbus-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.225485 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdq9b\" (UniqueName: \"kubernetes.io/projected/796aeb37-4f9a-401e-ad8d-5a9da9487e56-kube-api-access-qdq9b\") pod \"nmstate-metrics-54757c584b-26trv\" (UID: \"796aeb37-4f9a-401e-ad8d-5a9da9487e56\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv"
Jan 21 15:38:11 crc kubenswrapper[4773]: E0121 15:38:11.225659 4773 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Jan 21 15:38:11 crc kubenswrapper[4773]: E0121 15:38:11.225749 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair podName:7587ffb6-642a-4676-a627-1c77024022b2 nodeName:}" failed. No retries permitted until 2026-01-21 15:38:11.725727483 +0000 UTC m=+856.650217105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-22t95" (UID: "7587ffb6-642a-4676-a627-1c77024022b2") : secret "openshift-nmstate-webhook" not found
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.250899 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7f48\" (UniqueName: \"kubernetes.io/projected/7587ffb6-642a-4676-a627-1c77024022b2-kube-api-access-k7f48\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.250911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdq9b\" (UniqueName: \"kubernetes.io/projected/796aeb37-4f9a-401e-ad8d-5a9da9487e56-kube-api-access-qdq9b\") pod \"nmstate-metrics-54757c584b-26trv\" (UID: \"796aeb37-4f9a-401e-ad8d-5a9da9487e56\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.283145 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.309941 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-774595c759-vtsv7"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.310666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.323761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-774595c759-vtsv7"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-nmstate-lock\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-dbus-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-service-ca\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332547 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/100f2262-430a-4dd1-a8a2-2cfc06f6e345-kube-api-access-z7b9k\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-oauth-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-trusted-ca-bundle\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332612 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnql\" (UniqueName: \"kubernetes.io/projected/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-kube-api-access-rvnql\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/100f2262-430a-4dd1-a8a2-2cfc06f6e345-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332714 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332737 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-dbus-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-ovs-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwlz\" (UniqueName: \"kubernetes.io/projected/4e6397af-d97a-44cb-8e6d-babc6dab33c4-kube-api-access-drwlz\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-nmstate-lock\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4e6397af-d97a-44cb-8e6d-babc6dab33c4-ovs-socket\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.332931 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-oauth-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.358632 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwlz\" (UniqueName: \"kubernetes.io/projected/4e6397af-d97a-44cb-8e6d-babc6dab33c4-kube-api-access-drwlz\") pod \"nmstate-handler-xv8b2\" (UID: \"4e6397af-d97a-44cb-8e6d-babc6dab33c4\") " pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.361965 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xv8b2"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.433903 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-oauth-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.433948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.433963 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-trusted-ca-bundle\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.433984 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnql\" (UniqueName: \"kubernetes.io/projected/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-kube-api-access-rvnql\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.434951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/100f2262-430a-4dd1-a8a2-2cfc06f6e345-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.434979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.435008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-oauth-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.435129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.435148 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-service-ca\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.435169 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/100f2262-430a-4dd1-a8a2-2cfc06f6e345-kube-api-access-z7b9k\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.436321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/100f2262-430a-4dd1-a8a2-2cfc06f6e345-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.436890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-oauth-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.437035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: E0121 15:38:11.437103 4773 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Jan 21 15:38:11 crc kubenswrapper[4773]: E0121 15:38:11.437144 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert podName:100f2262-430a-4dd1-a8a2-2cfc06f6e345 nodeName:}" failed. No retries permitted until 2026-01-21 15:38:11.937129936 +0000 UTC m=+856.861619558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-x2tkc" (UID: "100f2262-430a-4dd1-a8a2-2cfc06f6e345") : secret "plugin-serving-cert" not found
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.439496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-trusted-ca-bundle\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.440976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-service-ca\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.455523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-oauth-config\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.460861 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-console-serving-cert\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.461803 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/100f2262-430a-4dd1-a8a2-2cfc06f6e345-kube-api-access-z7b9k\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.463954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnql\" (UniqueName: \"kubernetes.io/projected/0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c-kube-api-access-rvnql\") pod \"console-774595c759-vtsv7\" (UID: \"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c\") " pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.709837 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-774595c759-vtsv7"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.737940 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.741319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7587ffb6-642a-4676-a627-1c77024022b2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-22t95\" (UID: \"7587ffb6-642a-4676-a627-1c77024022b2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.780980 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-26trv"]
Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.881567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xv8b2"
event={"ID":"4e6397af-d97a-44cb-8e6d-babc6dab33c4","Type":"ContainerStarted","Data":"5b450682e07962573bf0087214b28043d658e1bef24acdee5d494c3942624b62"} Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.883432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv" event={"ID":"796aeb37-4f9a-401e-ad8d-5a9da9487e56","Type":"ContainerStarted","Data":"da8919257479c640b52d90357a3ebdd381a8799008fa87e1ac839c199586b94b"} Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.911330 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-774595c759-vtsv7"] Jan 21 15:38:11 crc kubenswrapper[4773]: W0121 15:38:11.914524 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e973547_b6d2_4ef7_bf66_37b7dd5d4f7c.slice/crio-58cda75347c6ba1772448e6977a7552e2805f40f0dc6bc82dbec6b4e5f577c38 WatchSource:0}: Error finding container 58cda75347c6ba1772448e6977a7552e2805f40f0dc6bc82dbec6b4e5f577c38: Status 404 returned error can't find the container with id 58cda75347c6ba1772448e6977a7552e2805f40f0dc6bc82dbec6b4e5f577c38 Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.942219 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.942276 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" Jan 21 15:38:11 crc kubenswrapper[4773]: I0121 15:38:11.947116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/100f2262-430a-4dd1-a8a2-2cfc06f6e345-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-x2tkc\" (UID: \"100f2262-430a-4dd1-a8a2-2cfc06f6e345\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.044218 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.136238 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95"] Jan 21 15:38:12 crc kubenswrapper[4773]: W0121 15:38:12.144005 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7587ffb6_642a_4676_a627_1c77024022b2.slice/crio-3f2c641231db0bdd7545e8ccd4270edd6f89beb51b6dbcf92ca2632511d9b1ad WatchSource:0}: Error finding container 3f2c641231db0bdd7545e8ccd4270edd6f89beb51b6dbcf92ca2632511d9b1ad: Status 404 returned error can't find the container with id 3f2c641231db0bdd7545e8ccd4270edd6f89beb51b6dbcf92ca2632511d9b1ad Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.238877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc"] Jan 21 15:38:12 crc kubenswrapper[4773]: W0121 15:38:12.243864 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100f2262_430a_4dd1_a8a2_2cfc06f6e345.slice/crio-bb07d5b221dfdbf6007e57df77d0496b4e9496979c432c927ebf157846b43ca8 WatchSource:0}: Error finding container bb07d5b221dfdbf6007e57df77d0496b4e9496979c432c927ebf157846b43ca8: Status 404 returned error can't find the container with id bb07d5b221dfdbf6007e57df77d0496b4e9496979c432c927ebf157846b43ca8 Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.891322 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" event={"ID":"7587ffb6-642a-4676-a627-1c77024022b2","Type":"ContainerStarted","Data":"3f2c641231db0bdd7545e8ccd4270edd6f89beb51b6dbcf92ca2632511d9b1ad"} Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.892873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" event={"ID":"100f2262-430a-4dd1-a8a2-2cfc06f6e345","Type":"ContainerStarted","Data":"bb07d5b221dfdbf6007e57df77d0496b4e9496979c432c927ebf157846b43ca8"} Jan 21 15:38:12 crc kubenswrapper[4773]: I0121 15:38:12.894135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-774595c759-vtsv7" event={"ID":"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c","Type":"ContainerStarted","Data":"58cda75347c6ba1772448e6977a7552e2805f40f0dc6bc82dbec6b4e5f577c38"} Jan 21 15:38:15 crc kubenswrapper[4773]: I0121 15:38:15.917213 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-774595c759-vtsv7" event={"ID":"0e973547-b6d2-4ef7-bf66-37b7dd5d4f7c","Type":"ContainerStarted","Data":"5e1fae192b757619137eb6e5f5ef0a6a6429536176fbd3985ba34103a07681a7"} Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.967425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv" event={"ID":"796aeb37-4f9a-401e-ad8d-5a9da9487e56","Type":"ContainerStarted","Data":"6712f5cfb1e4e63de648c2b9a10fdcb98c95597268f2ecd14e586e37557845e0"} Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.969463 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.969490 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" event={"ID":"7587ffb6-642a-4676-a627-1c77024022b2","Type":"ContainerStarted","Data":"0d5da9ad198fdcb8670ef5f0120d0f6cc7c9a9550bb251a3eb49685eaf152e18"} Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.970360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" 
event={"ID":"100f2262-430a-4dd1-a8a2-2cfc06f6e345","Type":"ContainerStarted","Data":"c81cb3f97bf04ac3f9c97a2cc379aa4d5782f23955e29f07a8fbb67cf1394535"} Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.971769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xv8b2" event={"ID":"4e6397af-d97a-44cb-8e6d-babc6dab33c4","Type":"ContainerStarted","Data":"eff6ecfaac32534a9d1216ab6a34ce77876fa609de803581c6623792a8b53bc8"} Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.971993 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xv8b2" Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.986412 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" podStartSLOduration=3.360073575 podStartE2EDuration="9.986393197s" podCreationTimestamp="2026-01-21 15:38:10 +0000 UTC" firstStartedPulling="2026-01-21 15:38:12.154167172 +0000 UTC m=+857.078656794" lastFinishedPulling="2026-01-21 15:38:18.780486794 +0000 UTC m=+863.704976416" observedRunningTime="2026-01-21 15:38:19.984736531 +0000 UTC m=+864.909226153" watchObservedRunningTime="2026-01-21 15:38:19.986393197 +0000 UTC m=+864.910882819" Jan 21 15:38:19 crc kubenswrapper[4773]: I0121 15:38:19.992729 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-774595c759-vtsv7" podStartSLOduration=8.992687336 podStartE2EDuration="8.992687336s" podCreationTimestamp="2026-01-21 15:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:38:15.940541348 +0000 UTC m=+860.865030970" watchObservedRunningTime="2026-01-21 15:38:19.992687336 +0000 UTC m=+864.917176958" Jan 21 15:38:20 crc kubenswrapper[4773]: I0121 15:38:20.017350 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-xv8b2" podStartSLOduration=3.016310586 podStartE2EDuration="10.017331623s" podCreationTimestamp="2026-01-21 15:38:10 +0000 UTC" firstStartedPulling="2026-01-21 15:38:11.415797703 +0000 UTC m=+856.340287325" lastFinishedPulling="2026-01-21 15:38:18.41681874 +0000 UTC m=+863.341308362" observedRunningTime="2026-01-21 15:38:20.013645379 +0000 UTC m=+864.938135021" watchObservedRunningTime="2026-01-21 15:38:20.017331623 +0000 UTC m=+864.941821245" Jan 21 15:38:20 crc kubenswrapper[4773]: I0121 15:38:20.020165 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-x2tkc" podStartSLOduration=2.403888586 podStartE2EDuration="9.020157053s" podCreationTimestamp="2026-01-21 15:38:11 +0000 UTC" firstStartedPulling="2026-01-21 15:38:12.245615521 +0000 UTC m=+857.170105143" lastFinishedPulling="2026-01-21 15:38:18.861883988 +0000 UTC m=+863.786373610" observedRunningTime="2026-01-21 15:38:19.99955662 +0000 UTC m=+864.924046242" watchObservedRunningTime="2026-01-21 15:38:20.020157053 +0000 UTC m=+864.944646685" Jan 21 15:38:21 crc kubenswrapper[4773]: I0121 15:38:21.710268 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-774595c759-vtsv7" Jan 21 15:38:21 crc kubenswrapper[4773]: I0121 15:38:21.710330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-774595c759-vtsv7" Jan 21 15:38:21 crc kubenswrapper[4773]: I0121 15:38:21.716166 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-774595c759-vtsv7" Jan 21 15:38:21 crc kubenswrapper[4773]: I0121 15:38:21.988743 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-774595c759-vtsv7" Jan 21 15:38:22 crc kubenswrapper[4773]: I0121 15:38:22.031536 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:38:22 crc kubenswrapper[4773]: I0121 15:38:22.993893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv" event={"ID":"796aeb37-4f9a-401e-ad8d-5a9da9487e56","Type":"ContainerStarted","Data":"89ae8d19595a5bab7618d109e8d900e2fed44012b5d94d0a8588c26da316fb15"} Jan 21 15:38:23 crc kubenswrapper[4773]: I0121 15:38:23.009555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-26trv" podStartSLOduration=2.708600895 podStartE2EDuration="13.009534838s" podCreationTimestamp="2026-01-21 15:38:10 +0000 UTC" firstStartedPulling="2026-01-21 15:38:11.814050366 +0000 UTC m=+856.738539988" lastFinishedPulling="2026-01-21 15:38:22.114984309 +0000 UTC m=+867.039473931" observedRunningTime="2026-01-21 15:38:23.00816463 +0000 UTC m=+867.932654252" watchObservedRunningTime="2026-01-21 15:38:23.009534838 +0000 UTC m=+867.934024460" Jan 21 15:38:26 crc kubenswrapper[4773]: I0121 15:38:26.392675 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xv8b2" Jan 21 15:38:31 crc kubenswrapper[4773]: I0121 15:38:31.948405 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-22t95" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.073394 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xwn45" podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerName="console" containerID="cri-o://ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f" gracePeriod=15 Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.703655 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xwn45_e26a3952-09c7-455b-ac02-a18c778eec8e/console/0.log" Jan 21 15:38:47 crc 
kubenswrapper[4773]: I0121 15:38:47.704101 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869503 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869816 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxcg\" (UniqueName: \"kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869867 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869891 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869918 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc 
kubenswrapper[4773]: I0121 15:38:47.869941 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.869957 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle\") pod \"e26a3952-09c7-455b-ac02-a18c778eec8e\" (UID: \"e26a3952-09c7-455b-ac02-a18c778eec8e\") " Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.870868 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.870946 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.871044 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.871152 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config" (OuterVolumeSpecName: "console-config") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.875562 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg" (OuterVolumeSpecName: "kube-api-access-bhxcg") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "kube-api-access-bhxcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.876831 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.884377 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e26a3952-09c7-455b-ac02-a18c778eec8e" (UID: "e26a3952-09c7-455b-ac02-a18c778eec8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971174 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971246 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971260 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971272 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26a3952-09c7-455b-ac02-a18c778eec8e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971283 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971295 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26a3952-09c7-455b-ac02-a18c778eec8e-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:47 crc kubenswrapper[4773]: I0121 15:38:47.971305 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxcg\" (UniqueName: \"kubernetes.io/projected/e26a3952-09c7-455b-ac02-a18c778eec8e-kube-api-access-bhxcg\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:48 crc 
kubenswrapper[4773]: I0121 15:38:48.150796 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xwn45_e26a3952-09c7-455b-ac02-a18c778eec8e/console/0.log" Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.150968 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xwn45" Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.150885 4773 generic.go:334] "Generic (PLEG): container finished" podID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerID="ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f" exitCode=2 Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.151057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xwn45" event={"ID":"e26a3952-09c7-455b-ac02-a18c778eec8e","Type":"ContainerDied","Data":"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f"} Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.151096 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xwn45" event={"ID":"e26a3952-09c7-455b-ac02-a18c778eec8e","Type":"ContainerDied","Data":"a4f67995b950b15aa0355cdb8768036c75a86a85beb3e8b21376f870a16a2cbf"} Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.151118 4773 scope.go:117] "RemoveContainer" containerID="ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f" Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.187021 4773 scope.go:117] "RemoveContainer" containerID="ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f" Jan 21 15:38:48 crc kubenswrapper[4773]: E0121 15:38:48.187543 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f\": container with ID starting with ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f not 
found: ID does not exist" containerID="ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f" Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.187593 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f"} err="failed to get container status \"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f\": rpc error: code = NotFound desc = could not find container \"ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f\": container with ID starting with ce6b8a16e3eebd1adcc151f7f7abba17c000c602c8ffaa1f0bce7d8642815f1f not found: ID does not exist" Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.192329 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:38:48 crc kubenswrapper[4773]: I0121 15:38:48.197416 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xwn45"] Jan 21 15:38:49 crc kubenswrapper[4773]: I0121 15:38:49.393224 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" path="/var/lib/kubelet/pods/e26a3952-09c7-455b-ac02-a18c778eec8e/volumes" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.095783 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm"] Jan 21 15:38:53 crc kubenswrapper[4773]: E0121 15:38:53.096507 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerName="console" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.096523 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerName="console" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.096638 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e26a3952-09c7-455b-ac02-a18c778eec8e" containerName="console" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.097465 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.099204 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.109284 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm"] Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.235839 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.235949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97szg\" (UniqueName: \"kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.236062 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" 
(UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.338091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.338237 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97szg\" (UniqueName: \"kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.338288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.338760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc 
kubenswrapper[4773]: I0121 15:38:53.338760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.360473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97szg\" (UniqueName: \"kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.420869 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:53 crc kubenswrapper[4773]: I0121 15:38:53.609657 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm"] Jan 21 15:38:53 crc kubenswrapper[4773]: W0121 15:38:53.614074 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6069f608_c03c_4128_ac35_0b5de3f22145.slice/crio-69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f WatchSource:0}: Error finding container 69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f: Status 404 returned error can't find the container with id 69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f Jan 21 15:38:54 crc kubenswrapper[4773]: I0121 15:38:54.188903 4773 generic.go:334] "Generic (PLEG): container finished" podID="6069f608-c03c-4128-ac35-0b5de3f22145" containerID="aed8b870e7256fca8b434849e9f92b07f491fb5e6d5b27b9d80fc210733df0e2" exitCode=0 Jan 21 15:38:54 crc kubenswrapper[4773]: I0121 15:38:54.188981 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" event={"ID":"6069f608-c03c-4128-ac35-0b5de3f22145","Type":"ContainerDied","Data":"aed8b870e7256fca8b434849e9f92b07f491fb5e6d5b27b9d80fc210733df0e2"} Jan 21 15:38:54 crc kubenswrapper[4773]: I0121 15:38:54.189247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" event={"ID":"6069f608-c03c-4128-ac35-0b5de3f22145","Type":"ContainerStarted","Data":"69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f"} Jan 21 15:38:56 crc kubenswrapper[4773]: I0121 15:38:56.203306 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="6069f608-c03c-4128-ac35-0b5de3f22145" containerID="d7c89cc9379bff860e5cebceb196bb202628955318d22a9e8ff00fa06db3edad" exitCode=0 Jan 21 15:38:56 crc kubenswrapper[4773]: I0121 15:38:56.203402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" event={"ID":"6069f608-c03c-4128-ac35-0b5de3f22145","Type":"ContainerDied","Data":"d7c89cc9379bff860e5cebceb196bb202628955318d22a9e8ff00fa06db3edad"} Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.213052 4773 generic.go:334] "Generic (PLEG): container finished" podID="6069f608-c03c-4128-ac35-0b5de3f22145" containerID="dd4b0d9999a6a8bdf4ab89b3a7c27b59d5a45536b2fdde6721487f680ce8a410" exitCode=0 Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.213103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" event={"ID":"6069f608-c03c-4128-ac35-0b5de3f22145","Type":"ContainerDied","Data":"dd4b0d9999a6a8bdf4ab89b3a7c27b59d5a45536b2fdde6721487f680ce8a410"} Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.453187 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpmc2"] Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.454646 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.470561 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpmc2"] Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.592516 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.592566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.592604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9s5\" (UniqueName: \"kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.693948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9s5\" (UniqueName: \"kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.694090 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.694113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.694576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.694714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.719570 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9s5\" (UniqueName: \"kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5\") pod \"community-operators-qpmc2\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") " pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:57 crc kubenswrapper[4773]: I0121 15:38:57.770085 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.239682 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpmc2"] Jan 21 15:38:58 crc kubenswrapper[4773]: W0121 15:38:58.261189 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dde2d17_a941_4beb_9be7_c265c447685b.slice/crio-09af5d9c1d753911225ae8279c95e30473e497fb8154e65b701bc6730ebf3f17 WatchSource:0}: Error finding container 09af5d9c1d753911225ae8279c95e30473e497fb8154e65b701bc6730ebf3f17: Status 404 returned error can't find the container with id 09af5d9c1d753911225ae8279c95e30473e497fb8154e65b701bc6730ebf3f17 Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.509633 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.609338 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util\") pod \"6069f608-c03c-4128-ac35-0b5de3f22145\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.609480 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97szg\" (UniqueName: \"kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg\") pod \"6069f608-c03c-4128-ac35-0b5de3f22145\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.609588 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle\") pod 
\"6069f608-c03c-4128-ac35-0b5de3f22145\" (UID: \"6069f608-c03c-4128-ac35-0b5de3f22145\") " Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.610678 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle" (OuterVolumeSpecName: "bundle") pod "6069f608-c03c-4128-ac35-0b5de3f22145" (UID: "6069f608-c03c-4128-ac35-0b5de3f22145"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.615528 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg" (OuterVolumeSpecName: "kube-api-access-97szg") pod "6069f608-c03c-4128-ac35-0b5de3f22145" (UID: "6069f608-c03c-4128-ac35-0b5de3f22145"). InnerVolumeSpecName "kube-api-access-97szg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.623930 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util" (OuterVolumeSpecName: "util") pod "6069f608-c03c-4128-ac35-0b5de3f22145" (UID: "6069f608-c03c-4128-ac35-0b5de3f22145"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.710884 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.710953 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6069f608-c03c-4128-ac35-0b5de3f22145-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:58 crc kubenswrapper[4773]: I0121 15:38:58.710964 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97szg\" (UniqueName: \"kubernetes.io/projected/6069f608-c03c-4128-ac35-0b5de3f22145-kube-api-access-97szg\") on node \"crc\" DevicePath \"\"" Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.226044 4773 generic.go:334] "Generic (PLEG): container finished" podID="5dde2d17-a941-4beb-9be7-c265c447685b" containerID="1b96acf4681824dd61dd1e5bbd5b3246eaddc0c54851a31d4c6fe19af506be29" exitCode=0 Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.226146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerDied","Data":"1b96acf4681824dd61dd1e5bbd5b3246eaddc0c54851a31d4c6fe19af506be29"} Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.226177 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerStarted","Data":"09af5d9c1d753911225ae8279c95e30473e497fb8154e65b701bc6730ebf3f17"} Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.229981 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" 
event={"ID":"6069f608-c03c-4128-ac35-0b5de3f22145","Type":"ContainerDied","Data":"69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f"} Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.230035 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm" Jan 21 15:38:59 crc kubenswrapper[4773]: I0121 15:38:59.230046 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b63e3a028e1a1eb33edec53b91807b335606364fcb7764a7b515fb82ef7b5f" Jan 21 15:39:00 crc kubenswrapper[4773]: I0121 15:39:00.236553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerStarted","Data":"7c1ae240bd1b47fdcc8307649346bf1ddcb21609e7dea22651c327f104c2d961"} Jan 21 15:39:01 crc kubenswrapper[4773]: I0121 15:39:01.244459 4773 generic.go:334] "Generic (PLEG): container finished" podID="5dde2d17-a941-4beb-9be7-c265c447685b" containerID="7c1ae240bd1b47fdcc8307649346bf1ddcb21609e7dea22651c327f104c2d961" exitCode=0 Jan 21 15:39:01 crc kubenswrapper[4773]: I0121 15:39:01.244510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerDied","Data":"7c1ae240bd1b47fdcc8307649346bf1ddcb21609e7dea22651c327f104c2d961"} Jan 21 15:39:02 crc kubenswrapper[4773]: I0121 15:39:02.252332 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerStarted","Data":"c8216e115643a02b4733e2c8a0bb28132ea612d8a025d37974926096a6f03e1c"} Jan 21 15:39:02 crc kubenswrapper[4773]: I0121 15:39:02.268478 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-qpmc2" podStartSLOduration=2.561360034 podStartE2EDuration="5.268459299s" podCreationTimestamp="2026-01-21 15:38:57 +0000 UTC" firstStartedPulling="2026-01-21 15:38:59.227497983 +0000 UTC m=+904.151987625" lastFinishedPulling="2026-01-21 15:39:01.934597268 +0000 UTC m=+906.859086890" observedRunningTime="2026-01-21 15:39:02.267190763 +0000 UTC m=+907.191680385" watchObservedRunningTime="2026-01-21 15:39:02.268459299 +0000 UTC m=+907.192948911" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.055829 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gktx"] Jan 21 15:39:03 crc kubenswrapper[4773]: E0121 15:39:03.056128 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="util" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.056142 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="util" Jan 21 15:39:03 crc kubenswrapper[4773]: E0121 15:39:03.056155 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="extract" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.056165 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="extract" Jan 21 15:39:03 crc kubenswrapper[4773]: E0121 15:39:03.056184 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="pull" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.056193 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="pull" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.056338 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6069f608-c03c-4128-ac35-0b5de3f22145" containerName="extract" Jan 21 
15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.057435 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.067398 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gktx"] Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.163615 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.163713 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr97k\" (UniqueName: \"kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.163741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.265322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " 
pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.265753 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.265809 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr97k\" (UniqueName: \"kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.266166 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.266292 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.294018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr97k\" (UniqueName: \"kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k\") pod \"certified-operators-4gktx\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") " 
pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:03 crc kubenswrapper[4773]: I0121 15:39:03.371604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gktx" Jan 21 15:39:04 crc kubenswrapper[4773]: I0121 15:39:04.162162 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gktx"] Jan 21 15:39:04 crc kubenswrapper[4773]: I0121 15:39:04.265227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerStarted","Data":"f52d59d5ed65fa5765e519ff2d7b51d14219f018fb2db4de0effc9d5ebbbcbf6"} Jan 21 15:39:05 crc kubenswrapper[4773]: I0121 15:39:05.273317 4773 generic.go:334] "Generic (PLEG): container finished" podID="76180b21-8f56-4ed8-8989-d100960852dd" containerID="7e0f5161385e42da70720eb3ec0214c20be017a808db3648446cf1f553a35cda" exitCode=0 Jan 21 15:39:05 crc kubenswrapper[4773]: I0121 15:39:05.273423 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerDied","Data":"7e0f5161385e42da70720eb3ec0214c20be017a808db3648446cf1f553a35cda"} Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.657003 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"] Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.658639 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.665881 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"] Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.810199 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.810514 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.810538 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6bhs\" (UniqueName: \"kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.911609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.912132 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.913972 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.914262 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.914299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6bhs\" (UniqueName: \"kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.933664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6bhs\" (UniqueName: \"kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs\") pod \"redhat-marketplace-whm2j\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") " pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:06 crc kubenswrapper[4773]: I0121 15:39:06.977011 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whm2j" Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.286381 4773 generic.go:334] "Generic (PLEG): container finished" podID="76180b21-8f56-4ed8-8989-d100960852dd" containerID="70e97ef0e2617fb6dbcf47811ad2c5273e2ecdaa3598da5eb8f859b155f83456" exitCode=0 Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.286471 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerDied","Data":"70e97ef0e2617fb6dbcf47811ad2c5273e2ecdaa3598da5eb8f859b155f83456"} Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.462524 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"] Jan 21 15:39:07 crc kubenswrapper[4773]: W0121 15:39:07.465990 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a466272_ba16_4a82_ace4_1f13f147c3cd.slice/crio-6593f2f8473e1e34189d2060f0337f315ff5a6bbe91c8414f4ee6badfadba8e8 WatchSource:0}: Error finding container 6593f2f8473e1e34189d2060f0337f315ff5a6bbe91c8414f4ee6badfadba8e8: Status 404 returned error can't find the container with id 6593f2f8473e1e34189d2060f0337f315ff5a6bbe91c8414f4ee6badfadba8e8 Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.770463 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.770571 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:39:07 crc kubenswrapper[4773]: I0121 15:39:07.823794 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpmc2" Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.293831 
4773 generic.go:334] "Generic (PLEG): container finished" podID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerID="6676e8c5646ec846dea71b79162dbe1108973c68b617a31cc358aa19d1728b4b" exitCode=0
Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.293953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerDied","Data":"6676e8c5646ec846dea71b79162dbe1108973c68b617a31cc358aa19d1728b4b"}
Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.294039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerStarted","Data":"6593f2f8473e1e34189d2060f0337f315ff5a6bbe91c8414f4ee6badfadba8e8"}
Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.296741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerStarted","Data":"cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf"}
Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.338858 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gktx" podStartSLOduration=2.882455745 podStartE2EDuration="5.338838613s" podCreationTimestamp="2026-01-21 15:39:03 +0000 UTC" firstStartedPulling="2026-01-21 15:39:05.274891997 +0000 UTC m=+910.199381619" lastFinishedPulling="2026-01-21 15:39:07.731274865 +0000 UTC m=+912.655764487" observedRunningTime="2026-01-21 15:39:08.335065046 +0000 UTC m=+913.259554688" watchObservedRunningTime="2026-01-21 15:39:08.338838613 +0000 UTC m=+913.263328235"
Jan 21 15:39:08 crc kubenswrapper[4773]: I0121 15:39:08.347383 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpmc2"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.509583 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"]
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.510586 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.512383 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.512731 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.513064 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.515917 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.517203 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d5xh4"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.534881 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"]
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.653314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5n7\" (UniqueName: \"kubernetes.io/projected/bfc5861c-71cc-4485-8fbd-cc661354fe03-kube-api-access-9v5n7\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.653386 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-apiservice-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.653450 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-webhook-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.754285 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-apiservice-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.754350 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-webhook-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.754405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5n7\" (UniqueName: \"kubernetes.io/projected/bfc5861c-71cc-4485-8fbd-cc661354fe03-kube-api-access-9v5n7\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.830040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-webhook-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.830128 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bfc5861c-71cc-4485-8fbd-cc661354fe03-apiservice-cert\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.834422 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5n7\" (UniqueName: \"kubernetes.io/projected/bfc5861c-71cc-4485-8fbd-cc661354fe03-kube-api-access-9v5n7\") pod \"metallb-operator-controller-manager-5dcc476cd5-nk2zw\" (UID: \"bfc5861c-71cc-4485-8fbd-cc661354fe03\") " pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.849737 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpmc2"]
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.858569 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"]
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.859370 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.861520 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.861687 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tvnvf"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.862016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.876453 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"]
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.957755 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-apiservice-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.957840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-webhook-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:09 crc kubenswrapper[4773]: I0121 15:39:09.957941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nslr\" (UniqueName: \"kubernetes.io/projected/222e3273-8373-410d-b8f1-fe19aa307ed5-kube-api-access-8nslr\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.059395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-apiservice-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.059470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-webhook-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.059545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nslr\" (UniqueName: \"kubernetes.io/projected/222e3273-8373-410d-b8f1-fe19aa307ed5-kube-api-access-8nslr\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.062502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-webhook-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.068434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/222e3273-8373-410d-b8f1-fe19aa307ed5-apiservice-cert\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.080410 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nslr\" (UniqueName: \"kubernetes.io/projected/222e3273-8373-410d-b8f1-fe19aa307ed5-kube-api-access-8nslr\") pod \"metallb-operator-webhook-server-5d664df476-46xlp\" (UID: \"222e3273-8373-410d-b8f1-fe19aa307ed5\") " pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.130020 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.174388 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.325834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerStarted","Data":"ad4e33603b191171a9e09ce49177250e25bd131bf6240a8b4e8cba32b276fb95"}
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.638206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"]
Jan 21 15:39:10 crc kubenswrapper[4773]: I0121 15:39:10.705834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"]
Jan 21 15:39:10 crc kubenswrapper[4773]: W0121 15:39:10.711984 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod222e3273_8373_410d_b8f1_fe19aa307ed5.slice/crio-1e2796d6881b2243b55ae7008f01a6e4ee019c52cce2b0a041a89d6d4c5d1008 WatchSource:0}: Error finding container 1e2796d6881b2243b55ae7008f01a6e4ee019c52cce2b0a041a89d6d4c5d1008: Status 404 returned error can't find the container with id 1e2796d6881b2243b55ae7008f01a6e4ee019c52cce2b0a041a89d6d4c5d1008
Jan 21 15:39:11 crc kubenswrapper[4773]: I0121 15:39:11.353708 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp" event={"ID":"222e3273-8373-410d-b8f1-fe19aa307ed5","Type":"ContainerStarted","Data":"1e2796d6881b2243b55ae7008f01a6e4ee019c52cce2b0a041a89d6d4c5d1008"}
Jan 21 15:39:11 crc kubenswrapper[4773]: I0121 15:39:11.371499 4773 generic.go:334] "Generic (PLEG): container finished" podID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerID="ad4e33603b191171a9e09ce49177250e25bd131bf6240a8b4e8cba32b276fb95" exitCode=0
Jan 21 15:39:11 crc kubenswrapper[4773]: I0121 15:39:11.371584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerDied","Data":"ad4e33603b191171a9e09ce49177250e25bd131bf6240a8b4e8cba32b276fb95"}
Jan 21 15:39:11 crc kubenswrapper[4773]: I0121 15:39:11.377050 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpmc2" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="registry-server" containerID="cri-o://c8216e115643a02b4733e2c8a0bb28132ea612d8a025d37974926096a6f03e1c" gracePeriod=2
Jan 21 15:39:11 crc kubenswrapper[4773]: I0121 15:39:11.377188 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" event={"ID":"bfc5861c-71cc-4485-8fbd-cc661354fe03","Type":"ContainerStarted","Data":"1d8b21e898b13abf2651e8d3096081e90369c531084ec0de9b71aa587c1fdc6a"}
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.384667 4773 generic.go:334] "Generic (PLEG): container finished" podID="5dde2d17-a941-4beb-9be7-c265c447685b" containerID="c8216e115643a02b4733e2c8a0bb28132ea612d8a025d37974926096a6f03e1c" exitCode=0
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.384716 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerDied","Data":"c8216e115643a02b4733e2c8a0bb28132ea612d8a025d37974926096a6f03e1c"}
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.880212 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpmc2"
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.999716 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn9s5\" (UniqueName: \"kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5\") pod \"5dde2d17-a941-4beb-9be7-c265c447685b\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") "
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.999814 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content\") pod \"5dde2d17-a941-4beb-9be7-c265c447685b\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") "
Jan 21 15:39:12 crc kubenswrapper[4773]: I0121 15:39:12.999869 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities\") pod \"5dde2d17-a941-4beb-9be7-c265c447685b\" (UID: \"5dde2d17-a941-4beb-9be7-c265c447685b\") "
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.002214 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities" (OuterVolumeSpecName: "utilities") pod "5dde2d17-a941-4beb-9be7-c265c447685b" (UID: "5dde2d17-a941-4beb-9be7-c265c447685b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.024391 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5" (OuterVolumeSpecName: "kube-api-access-wn9s5") pod "5dde2d17-a941-4beb-9be7-c265c447685b" (UID: "5dde2d17-a941-4beb-9be7-c265c447685b"). InnerVolumeSpecName "kube-api-access-wn9s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.055635 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dde2d17-a941-4beb-9be7-c265c447685b" (UID: "5dde2d17-a941-4beb-9be7-c265c447685b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.103675 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn9s5\" (UniqueName: \"kubernetes.io/projected/5dde2d17-a941-4beb-9be7-c265c447685b-kube-api-access-wn9s5\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.103740 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.103762 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dde2d17-a941-4beb-9be7-c265c447685b-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.372669 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.373033 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.395060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerStarted","Data":"6c7adc1108fbc765b7497c985e40c007e5cbfb806fb52d8ff3c6c39c3811040f"}
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.400106 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpmc2" event={"ID":"5dde2d17-a941-4beb-9be7-c265c447685b","Type":"ContainerDied","Data":"09af5d9c1d753911225ae8279c95e30473e497fb8154e65b701bc6730ebf3f17"}
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.400163 4773 scope.go:117] "RemoveContainer" containerID="c8216e115643a02b4733e2c8a0bb28132ea612d8a025d37974926096a6f03e1c"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.400296 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpmc2"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.426626 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whm2j" podStartSLOduration=3.40069598 podStartE2EDuration="7.426604205s" podCreationTimestamp="2026-01-21 15:39:06 +0000 UTC" firstStartedPulling="2026-01-21 15:39:08.295237739 +0000 UTC m=+913.219727361" lastFinishedPulling="2026-01-21 15:39:12.321145974 +0000 UTC m=+917.245635586" observedRunningTime="2026-01-21 15:39:13.41301471 +0000 UTC m=+918.337504332" watchObservedRunningTime="2026-01-21 15:39:13.426604205 +0000 UTC m=+918.351093837"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.433948 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.437415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpmc2"]
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.445527 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpmc2"]
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.447420 4773 scope.go:117] "RemoveContainer" containerID="7c1ae240bd1b47fdcc8307649346bf1ddcb21609e7dea22651c327f104c2d961"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.479036 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:13 crc kubenswrapper[4773]: I0121 15:39:13.504947 4773 scope.go:117] "RemoveContainer" containerID="1b96acf4681824dd61dd1e5bbd5b3246eaddc0c54851a31d4c6fe19af506be29"
Jan 21 15:39:15 crc kubenswrapper[4773]: I0121 15:39:15.392560 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" path="/var/lib/kubelet/pods/5dde2d17-a941-4beb-9be7-c265c447685b/volumes"
Jan 21 15:39:16 crc kubenswrapper[4773]: I0121 15:39:16.444648 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gktx"]
Jan 21 15:39:16 crc kubenswrapper[4773]: I0121 15:39:16.444912 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gktx" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="registry-server" containerID="cri-o://cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf" gracePeriod=2
Jan 21 15:39:16 crc kubenswrapper[4773]: I0121 15:39:16.978067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:16 crc kubenswrapper[4773]: I0121 15:39:16.978169 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:17 crc kubenswrapper[4773]: I0121 15:39:17.018720 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:17 crc kubenswrapper[4773]: I0121 15:39:17.469259 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:20 crc kubenswrapper[4773]: I0121 15:39:20.048728 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"]
Jan 21 15:39:20 crc kubenswrapper[4773]: I0121 15:39:20.049013 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-whm2j" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="registry-server" containerID="cri-o://6c7adc1108fbc765b7497c985e40c007e5cbfb806fb52d8ff3c6c39c3811040f" gracePeriod=2
Jan 21 15:39:21 crc kubenswrapper[4773]: I0121 15:39:21.459051 4773 generic.go:334] "Generic (PLEG): container finished" podID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerID="6c7adc1108fbc765b7497c985e40c007e5cbfb806fb52d8ff3c6c39c3811040f" exitCode=0
Jan 21 15:39:21 crc kubenswrapper[4773]: I0121 15:39:21.459146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerDied","Data":"6c7adc1108fbc765b7497c985e40c007e5cbfb806fb52d8ff3c6c39c3811040f"}
Jan 21 15:39:21 crc kubenswrapper[4773]: I0121 15:39:21.467164 4773 generic.go:334] "Generic (PLEG): container finished" podID="76180b21-8f56-4ed8-8989-d100960852dd" containerID="cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf" exitCode=0
Jan 21 15:39:21 crc kubenswrapper[4773]: I0121 15:39:21.467233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerDied","Data":"cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf"}
Jan 21 15:39:23 crc kubenswrapper[4773]: E0121 15:39:23.373766 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf is running failed: container process not found" containerID="cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 15:39:23 crc kubenswrapper[4773]: E0121 15:39:23.375265 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf is running failed: container process not found" containerID="cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 15:39:23 crc kubenswrapper[4773]: E0121 15:39:23.375879 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf is running failed: container process not found" containerID="cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 15:39:23 crc kubenswrapper[4773]: E0121 15:39:23.375968 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-4gktx" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="registry-server"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.470274 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.490404 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gktx" event={"ID":"76180b21-8f56-4ed8-8989-d100960852dd","Type":"ContainerDied","Data":"f52d59d5ed65fa5765e519ff2d7b51d14219f018fb2db4de0effc9d5ebbbcbf6"}
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.490456 4773 scope.go:117] "RemoveContainer" containerID="cf4ab5970f7ab2e8e08ce5fc5a8350d5acfe0677c98200c98270546c6c8f18bf"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.490636 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gktx"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.538939 4773 scope.go:117] "RemoveContainer" containerID="70e97ef0e2617fb6dbcf47811ad2c5273e2ecdaa3598da5eb8f859b155f83456"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.558230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities\") pod \"76180b21-8f56-4ed8-8989-d100960852dd\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.558301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content\") pod \"76180b21-8f56-4ed8-8989-d100960852dd\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.558327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr97k\" (UniqueName: \"kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k\") pod \"76180b21-8f56-4ed8-8989-d100960852dd\" (UID: \"76180b21-8f56-4ed8-8989-d100960852dd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.562252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities" (OuterVolumeSpecName: "utilities") pod "76180b21-8f56-4ed8-8989-d100960852dd" (UID: "76180b21-8f56-4ed8-8989-d100960852dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.564893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k" (OuterVolumeSpecName: "kube-api-access-tr97k") pod "76180b21-8f56-4ed8-8989-d100960852dd" (UID: "76180b21-8f56-4ed8-8989-d100960852dd"). InnerVolumeSpecName "kube-api-access-tr97k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.582231 4773 scope.go:117] "RemoveContainer" containerID="7e0f5161385e42da70720eb3ec0214c20be017a808db3648446cf1f553a35cda"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.604244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76180b21-8f56-4ed8-8989-d100960852dd" (UID: "76180b21-8f56-4ed8-8989-d100960852dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.660126 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.660158 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr97k\" (UniqueName: \"kubernetes.io/projected/76180b21-8f56-4ed8-8989-d100960852dd-kube-api-access-tr97k\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.660169 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76180b21-8f56-4ed8-8989-d100960852dd-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.725329 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.831158 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gktx"]
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.834782 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gktx"]
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.862297 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content\") pod \"8a466272-ba16-4a82-ace4-1f13f147c3cd\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.862363 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6bhs\" (UniqueName: \"kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs\") pod \"8a466272-ba16-4a82-ace4-1f13f147c3cd\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.862448 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities\") pod \"8a466272-ba16-4a82-ace4-1f13f147c3cd\" (UID: \"8a466272-ba16-4a82-ace4-1f13f147c3cd\") "
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.863428 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities" (OuterVolumeSpecName: "utilities") pod "8a466272-ba16-4a82-ace4-1f13f147c3cd" (UID: "8a466272-ba16-4a82-ace4-1f13f147c3cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.867890 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs" (OuterVolumeSpecName: "kube-api-access-r6bhs") pod "8a466272-ba16-4a82-ace4-1f13f147c3cd" (UID: "8a466272-ba16-4a82-ace4-1f13f147c3cd"). InnerVolumeSpecName "kube-api-access-r6bhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.898954 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a466272-ba16-4a82-ace4-1f13f147c3cd" (UID: "8a466272-ba16-4a82-ace4-1f13f147c3cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.963825 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6bhs\" (UniqueName: \"kubernetes.io/projected/8a466272-ba16-4a82-ace4-1f13f147c3cd-kube-api-access-r6bhs\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.963863 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:23 crc kubenswrapper[4773]: I0121 15:39:23.963873 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a466272-ba16-4a82-ace4-1f13f147c3cd-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.499003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" event={"ID":"bfc5861c-71cc-4485-8fbd-cc661354fe03","Type":"ContainerStarted","Data":"aa7285e5731cbc9a2cb99d963ada37c331cb13e70c84423c583ae6f54aca8a1d"}
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.499910 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.500206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp" event={"ID":"222e3273-8373-410d-b8f1-fe19aa307ed5","Type":"ContainerStarted","Data":"b5cf6af304ad78a3c303d1d5ba58eabde5e19580067a8fc90eab0a1269da9e4c"}
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.500616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.502468 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whm2j" event={"ID":"8a466272-ba16-4a82-ace4-1f13f147c3cd","Type":"ContainerDied","Data":"6593f2f8473e1e34189d2060f0337f315ff5a6bbe91c8414f4ee6badfadba8e8"}
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.502503 4773 scope.go:117] "RemoveContainer" containerID="6c7adc1108fbc765b7497c985e40c007e5cbfb806fb52d8ff3c6c39c3811040f"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.502534 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whm2j"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.518471 4773 scope.go:117] "RemoveContainer" containerID="ad4e33603b191171a9e09ce49177250e25bd131bf6240a8b4e8cba32b276fb95"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.534951 4773 scope.go:117] "RemoveContainer" containerID="6676e8c5646ec846dea71b79162dbe1108973c68b617a31cc358aa19d1728b4b"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.555707 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" podStartSLOduration=2.713398041 podStartE2EDuration="15.555673708s" podCreationTimestamp="2026-01-21 15:39:09 +0000 UTC" firstStartedPulling="2026-01-21 15:39:10.639031381 +0000 UTC m=+915.563521003" lastFinishedPulling="2026-01-21 15:39:23.481307048 +0000 UTC m=+928.405796670" observedRunningTime="2026-01-21 15:39:24.535268891 +0000 UTC m=+929.459758513" watchObservedRunningTime="2026-01-21 15:39:24.555673708 +0000 UTC m=+929.480163330"
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.559590 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"]
Jan 21 15:39:24 crc kubenswrapper[4773]: I0121 15:39:24.560134 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-whm2j"]
Jan 21 15:39:25 crc kubenswrapper[4773]: I0121 15:39:25.206385 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:39:25 crc kubenswrapper[4773]: I0121 15:39:25.206754 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:39:25 crc kubenswrapper[4773]: I0121 15:39:25.392540 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76180b21-8f56-4ed8-8989-d100960852dd" path="/var/lib/kubelet/pods/76180b21-8f56-4ed8-8989-d100960852dd/volumes"
Jan 21 15:39:25 crc kubenswrapper[4773]: I0121 15:39:25.393476 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" path="/var/lib/kubelet/pods/8a466272-ba16-4a82-ace4-1f13f147c3cd/volumes"
Jan 21 15:39:40 crc kubenswrapper[4773]: I0121 15:39:40.180451 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp"
Jan 21 15:39:40 crc kubenswrapper[4773]: I0121 15:39:40.229003 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5d664df476-46xlp" podStartSLOduration=18.403519225 podStartE2EDuration="31.228984607s" podCreationTimestamp="2026-01-21 15:39:09 +0000 UTC" firstStartedPulling="2026-01-21 15:39:10.7138996 +0000 UTC m=+915.638389222" lastFinishedPulling="2026-01-21 15:39:23.539364982 +0000 UTC m=+928.463854604"
observedRunningTime="2026-01-21 15:39:24.574310286 +0000 UTC m=+929.498799938" watchObservedRunningTime="2026-01-21 15:39:40.228984607 +0000 UTC m=+945.153474239" Jan 21 15:39:55 crc kubenswrapper[4773]: I0121 15:39:55.205916 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:39:55 crc kubenswrapper[4773]: I0121 15:39:55.206435 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.134458 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902419 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dtvhb"] Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902835 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902858 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902871 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902880 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902898 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902907 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902920 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902927 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902941 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902948 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902957 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.902965 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="extract-content" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.902990 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903007 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.903020 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903030 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="extract-utilities" Jan 21 15:40:00 crc kubenswrapper[4773]: E0121 15:40:00.903041 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903049 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903189 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a466272-ba16-4a82-ace4-1f13f147c3cd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903201 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="76180b21-8f56-4ed8-8989-d100960852dd" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.903217 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dde2d17-a941-4beb-9be7-c265c447685b" containerName="registry-server" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.905890 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.906791 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"] Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.907480 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.908155 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.908541 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bvxvk" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.908805 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.909016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-sockets\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-reloader\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913641 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tsn\" (UniqueName: \"kubernetes.io/projected/1d536346-20d9-48b7-92d2-dd043c7cca4a-kube-api-access-m4tsn\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-conf\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913804 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rct4k\" (UniqueName: \"kubernetes.io/projected/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-kube-api-access-rct4k\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-startup\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.913947 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.927145 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"] Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.992612 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wptww"] Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.993717 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wptww" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.995763 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.997002 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zz8rz" Jan 21 15:40:00 crc kubenswrapper[4773]: I0121 15:40:00.999767 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.004511 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-vsxqt"] Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.005640 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.007351 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.007351 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-reloader\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014581 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-cert\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014610 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4tsn\" (UniqueName: \"kubernetes.io/projected/1d536346-20d9-48b7-92d2-dd043c7cca4a-kube-api-access-m4tsn\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 
15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-conf\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rct4k\" (UniqueName: \"kubernetes.io/projected/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-kube-api-access-rct4k\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-startup\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014756 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn74w\" (UniqueName: \"kubernetes.io/projected/0e29da82-0979-40a0-8b48-4ba06d87fd14-kube-api-access-jn74w\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014791 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014846 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0968840f-f0d5-4b41-8f6f-00b88d26758e-metallb-excludel2\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014870 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-sockets\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014924 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert\") pod 
\"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.014946 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42tgk\" (UniqueName: \"kubernetes.io/projected/0968840f-f0d5-4b41-8f6f-00b88d26758e-kube-api-access-42tgk\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.015274 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-reloader\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.015666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-conf\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.015676 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.015839 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.015968 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: 
E0121 15:40:01.015978 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs podName:e336cc2c-2e1a-4d7b-b516-bc360cee8c4b nodeName:}" failed. No retries permitted until 2026-01-21 15:40:01.51588276 +0000 UTC m=+966.440372382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs") pod "frr-k8s-dtvhb" (UID: "e336cc2c-2e1a-4d7b-b516-bc360cee8c4b") : secret "frr-k8s-certs-secret" not found Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.016155 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert podName:1d536346-20d9-48b7-92d2-dd043c7cca4a nodeName:}" failed. No retries permitted until 2026-01-21 15:40:01.516143078 +0000 UTC m=+966.440632730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert") pod "frr-k8s-webhook-server-7df86c4f6c-bqp84" (UID: "1d536346-20d9-48b7-92d2-dd043c7cca4a") : secret "frr-k8s-webhook-server-cert" not found Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.016083 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-sockets\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.016586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-frr-startup\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.038061 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-vsxqt"] Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.055027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rct4k\" (UniqueName: \"kubernetes.io/projected/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-kube-api-access-rct4k\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.072485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4tsn\" (UniqueName: \"kubernetes.io/projected/1d536346-20d9-48b7-92d2-dd043c7cca4a-kube-api-access-m4tsn\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116077 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-cert\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " 
pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn74w\" (UniqueName: \"kubernetes.io/projected/0e29da82-0979-40a0-8b48-4ba06d87fd14-kube-api-access-jn74w\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116228 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0968840f-f0d5-4b41-8f6f-00b88d26758e-metallb-excludel2\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.116276 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42tgk\" (UniqueName: \"kubernetes.io/projected/0968840f-f0d5-4b41-8f6f-00b88d26758e-kube-api-access-42tgk\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww" Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.116671 4773 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.116731 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs podName:0e29da82-0979-40a0-8b48-4ba06d87fd14 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:40:01.616718015 +0000 UTC m=+966.541207637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs") pod "controller-6968d8fdc4-vsxqt" (UID: "0e29da82-0979-40a0-8b48-4ba06d87fd14") : secret "controller-certs-secret" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.116883 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.116912 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist podName:0968840f-f0d5-4b41-8f6f-00b88d26758e nodeName:}" failed. No retries permitted until 2026-01-21 15:40:01.61690506 +0000 UTC m=+966.541394682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist") pod "speaker-wptww" (UID: "0968840f-f0d5-4b41-8f6f-00b88d26758e") : secret "metallb-memberlist" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.117044 4773 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.117070 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs podName:0968840f-f0d5-4b41-8f6f-00b88d26758e nodeName:}" failed. No retries permitted until 2026-01-21 15:40:01.617064104 +0000 UTC m=+966.541553726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs") pod "speaker-wptww" (UID: "0968840f-f0d5-4b41-8f6f-00b88d26758e") : secret "speaker-certs-secret" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.117630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0968840f-f0d5-4b41-8f6f-00b88d26758e-metallb-excludel2\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.119277 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.131057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-cert\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.136480 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn74w\" (UniqueName: \"kubernetes.io/projected/0e29da82-0979-40a0-8b48-4ba06d87fd14-kube-api-access-jn74w\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.143078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42tgk\" (UniqueName: \"kubernetes.io/projected/0968840f-f0d5-4b41-8f6f-00b88d26758e-kube-api-access-42tgk\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.522015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.522141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.525050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e336cc2c-2e1a-4d7b-b516-bc360cee8c4b-metrics-certs\") pod \"frr-k8s-dtvhb\" (UID: \"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b\") " pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.527669 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.531789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d536346-20d9-48b7-92d2-dd043c7cca4a-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bqp84\" (UID: \"1d536346-20d9-48b7-92d2-dd043c7cca4a\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.543925 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.623582 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.623929 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.624015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.625029 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: E0121 15:40:01.625184 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist podName:0968840f-f0d5-4b41-8f6f-00b88d26758e nodeName:}" failed. No retries permitted until 2026-01-21 15:40:02.625131325 +0000 UTC m=+967.549620987 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist") pod "speaker-wptww" (UID: "0968840f-f0d5-4b41-8f6f-00b88d26758e") : secret "metallb-memberlist" not found
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.631556 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e29da82-0979-40a0-8b48-4ba06d87fd14-metrics-certs\") pod \"controller-6968d8fdc4-vsxqt\" (UID: \"0e29da82-0979-40a0-8b48-4ba06d87fd14\") " pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.632146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-metrics-certs\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.925095 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:01 crc kubenswrapper[4773]: I0121 15:40:01.956466 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"]
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.137793 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-vsxqt"]
Jan 21 15:40:02 crc kubenswrapper[4773]: W0121 15:40:02.149663 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e29da82_0979_40a0_8b48_4ba06d87fd14.slice/crio-a2d293ff7405f296a5d272f0d7963596e70ef48822a72c3cd7fac76bbdf98fbb WatchSource:0}: Error finding container a2d293ff7405f296a5d272f0d7963596e70ef48822a72c3cd7fac76bbdf98fbb: Status 404 returned error can't find the container with id a2d293ff7405f296a5d272f0d7963596e70ef48822a72c3cd7fac76bbdf98fbb
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.643146 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.649203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0968840f-f0d5-4b41-8f6f-00b88d26758e-memberlist\") pod \"speaker-wptww\" (UID: \"0968840f-f0d5-4b41-8f6f-00b88d26758e\") " pod="metallb-system/speaker-wptww"
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.742409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"d71b767722ac8b8bc6cc1826284709feda81e82f90a5774b73743a9d1a30ef1b"}
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.745074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vsxqt" event={"ID":"0e29da82-0979-40a0-8b48-4ba06d87fd14","Type":"ContainerStarted","Data":"9243e36ee04952d392a0d376d57bad500b1dcdf7fc374e78cc3f9a1c8ac8ea9c"}
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.745145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vsxqt" event={"ID":"0e29da82-0979-40a0-8b48-4ba06d87fd14","Type":"ContainerStarted","Data":"d9a4b69dee7784263096084b7f0c6eb2a54e6d340fc8a71d1b54b0093a7af64f"}
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.745162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vsxqt" event={"ID":"0e29da82-0979-40a0-8b48-4ba06d87fd14","Type":"ContainerStarted","Data":"a2d293ff7405f296a5d272f0d7963596e70ef48822a72c3cd7fac76bbdf98fbb"}
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.745223 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.747028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" event={"ID":"1d536346-20d9-48b7-92d2-dd043c7cca4a","Type":"ContainerStarted","Data":"f298700c9d16839abc291b654003bbae7b0f7a0ca98c1ed2152ec2511b57abb9"}
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.766461 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-vsxqt" podStartSLOduration=2.766441111 podStartE2EDuration="2.766441111s" podCreationTimestamp="2026-01-21 15:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:40:02.763447456 +0000 UTC m=+967.687937078" watchObservedRunningTime="2026-01-21 15:40:02.766441111 +0000 UTC m=+967.690930743"
Jan 21 15:40:02 crc kubenswrapper[4773]: I0121 15:40:02.808408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wptww"
Jan 21 15:40:03 crc kubenswrapper[4773]: I0121 15:40:03.756340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wptww" event={"ID":"0968840f-f0d5-4b41-8f6f-00b88d26758e","Type":"ContainerStarted","Data":"ae92b551d09865802fcb063d9581fd0b1feaa2a50eb309a4150db8b800a5d817"}
Jan 21 15:40:03 crc kubenswrapper[4773]: I0121 15:40:03.756724 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wptww" event={"ID":"0968840f-f0d5-4b41-8f6f-00b88d26758e","Type":"ContainerStarted","Data":"169fe3fd0a3db8a05d548d105755d2ec1a22efc526f460edfea26974f4607207"}
Jan 21 15:40:03 crc kubenswrapper[4773]: I0121 15:40:03.756739 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wptww" event={"ID":"0968840f-f0d5-4b41-8f6f-00b88d26758e","Type":"ContainerStarted","Data":"52988525b13f8cf14e202449a697b461ead4389927d1acf327d18c2c86b84163"}
Jan 21 15:40:03 crc kubenswrapper[4773]: I0121 15:40:03.756995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wptww"
Jan 21 15:40:03 crc kubenswrapper[4773]: I0121 15:40:03.787314 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wptww" podStartSLOduration=3.7872909359999998 podStartE2EDuration="3.787290936s" podCreationTimestamp="2026-01-21 15:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:40:03.781151763 +0000 UTC m=+968.705641395" watchObservedRunningTime="2026-01-21 15:40:03.787290936 +0000 UTC m=+968.711780558"
Jan 21 15:40:10 crc kubenswrapper[4773]: I0121 15:40:10.814587 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" event={"ID":"1d536346-20d9-48b7-92d2-dd043c7cca4a","Type":"ContainerStarted","Data":"c9b5333168c68fa4f98cf67652c42a8fa62f84a07cb61a41fd631db0442c01e3"}
Jan 21 15:40:10 crc kubenswrapper[4773]: I0121 15:40:10.815295 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"
Jan 21 15:40:10 crc kubenswrapper[4773]: I0121 15:40:10.817103 4773 generic.go:334] "Generic (PLEG): container finished" podID="e336cc2c-2e1a-4d7b-b516-bc360cee8c4b" containerID="59e4d5dc991083ff66bdf154d8603697dde7f09712857ca1ccdf78c1d814b1ac" exitCode=0
Jan 21 15:40:10 crc kubenswrapper[4773]: I0121 15:40:10.817243 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerDied","Data":"59e4d5dc991083ff66bdf154d8603697dde7f09712857ca1ccdf78c1d814b1ac"}
Jan 21 15:40:10 crc kubenswrapper[4773]: I0121 15:40:10.836540 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84" podStartSLOduration=2.326588514 podStartE2EDuration="10.836526333s" podCreationTimestamp="2026-01-21 15:40:00 +0000 UTC" firstStartedPulling="2026-01-21 15:40:01.967931949 +0000 UTC m=+966.892421571" lastFinishedPulling="2026-01-21 15:40:10.477869768 +0000 UTC m=+975.402359390" observedRunningTime="2026-01-21 15:40:10.830430215 +0000 UTC m=+975.754919827" watchObservedRunningTime="2026-01-21 15:40:10.836526333 +0000 UTC m=+975.761015955"
Jan 21 15:40:11 crc kubenswrapper[4773]: I0121 15:40:11.826435 4773 generic.go:334] "Generic (PLEG): container finished" podID="e336cc2c-2e1a-4d7b-b516-bc360cee8c4b" containerID="e35d9fbfad5a13a6dbb5ebf39a9bbe9e0868c00acad94cb1dfc0494ca652ee81" exitCode=0
Jan 21 15:40:11 crc kubenswrapper[4773]: I0121 15:40:11.826499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerDied","Data":"e35d9fbfad5a13a6dbb5ebf39a9bbe9e0868c00acad94cb1dfc0494ca652ee81"}
Jan 21 15:40:12 crc kubenswrapper[4773]: I0121 15:40:12.834979 4773 generic.go:334] "Generic (PLEG): container finished" podID="e336cc2c-2e1a-4d7b-b516-bc360cee8c4b" containerID="3f646df7586e31cc7ca1ea0f3ffe3aa4f44268fcc8a592400d5408ea6d321404" exitCode=0
Jan 21 15:40:12 crc kubenswrapper[4773]: I0121 15:40:12.835088 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerDied","Data":"3f646df7586e31cc7ca1ea0f3ffe3aa4f44268fcc8a592400d5408ea6d321404"}
Jan 21 15:40:13 crc kubenswrapper[4773]: I0121 15:40:13.851384 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"fb33e1f4408fa0cdc64fbd53051f59565026d72714bdd0dc07c02d03fbe273d0"}
Jan 21 15:40:13 crc kubenswrapper[4773]: I0121 15:40:13.851770 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"d3dff2a481ecd54f452f32005af5c97fb49111d27ed8316d6473f46b6adb722a"}
Jan 21 15:40:13 crc kubenswrapper[4773]: I0121 15:40:13.851789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"6ff9dd644c91164d927dc2e05c0c8ff170996fb314ed8f0d9525c830a0ec9d9a"}
Jan 21 15:40:13 crc kubenswrapper[4773]: I0121 15:40:13.851802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"f4927527ff4843a7e533553a26e755dc3d36a6c3225c43796a38d6b7a5dc255f"}
Jan 21 15:40:13 crc kubenswrapper[4773]: I0121 15:40:13.851814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"5116a9a0b4ac4abcef3563420c3bea414210f99836a531f4761613dfeae93da6"}
Jan 21 15:40:14 crc kubenswrapper[4773]: I0121 15:40:14.861555 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dtvhb" event={"ID":"e336cc2c-2e1a-4d7b-b516-bc360cee8c4b","Type":"ContainerStarted","Data":"65aa9b95308a52386adf1ab173807236d4d1dcbcb38496026f944e8a1d60e8c9"}
Jan 21 15:40:14 crc kubenswrapper[4773]: I0121 15:40:14.862612 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:14 crc kubenswrapper[4773]: I0121 15:40:14.884546 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dtvhb" podStartSLOduration=6.775454807 podStartE2EDuration="14.884529955s" podCreationTimestamp="2026-01-21 15:40:00 +0000 UTC" firstStartedPulling="2026-01-21 15:40:02.38522907 +0000 UTC m=+967.309718692" lastFinishedPulling="2026-01-21 15:40:10.494304218 +0000 UTC m=+975.418793840" observedRunningTime="2026-01-21 15:40:14.882238802 +0000 UTC m=+979.806728434" watchObservedRunningTime="2026-01-21 15:40:14.884529955 +0000 UTC m=+979.809019577"
Jan 21 15:40:16 crc kubenswrapper[4773]: I0121 15:40:16.528638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:16 crc kubenswrapper[4773]: I0121 15:40:16.577047 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:21 crc kubenswrapper[4773]: I0121 15:40:21.549836 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bqp84"
Jan 21 15:40:21 crc kubenswrapper[4773]: I0121 15:40:21.931650 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-vsxqt"
Jan 21 15:40:22 crc kubenswrapper[4773]: I0121 15:40:22.812094 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wptww"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.206488 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.206871 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.206916 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.207522 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.207609 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140" gracePeriod=600
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.928167 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.929433 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.931918 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-598kj"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.931970 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.932172 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 21 15:40:25 crc kubenswrapper[4773]: I0121 15:40:25.937375 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.053503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgstf\" (UniqueName: \"kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf\") pod \"openstack-operator-index-h29mj\" (UID: \"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7\") " pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.155234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgstf\" (UniqueName: \"kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf\") pod \"openstack-operator-index-h29mj\" (UID: \"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7\") " pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.172600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgstf\" (UniqueName: \"kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf\") pod \"openstack-operator-index-h29mj\" (UID: \"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7\") " pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.244955 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:26 crc kubenswrapper[4773]: W0121 15:40:26.682395 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3248c2_85b9_45b8_ab6e_26bbafe2e0f7.slice/crio-52a621794ccef9e6e065c6ede4b0c7e92514f93b90809b8af4c6c3decb0c9049 WatchSource:0}: Error finding container 52a621794ccef9e6e065c6ede4b0c7e92514f93b90809b8af4c6c3decb0c9049: Status 404 returned error can't find the container with id 52a621794ccef9e6e065c6ede4b0c7e92514f93b90809b8af4c6c3decb0c9049
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.684760 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.688599 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.937530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h29mj" event={"ID":"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7","Type":"ContainerStarted","Data":"52a621794ccef9e6e065c6ede4b0c7e92514f93b90809b8af4c6c3decb0c9049"}
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.940381 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140" exitCode=0
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.940427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140"}
Jan 21 15:40:26 crc kubenswrapper[4773]: I0121 15:40:26.940471 4773 scope.go:117] "RemoveContainer" containerID="dfc6b9d6e0ca76822fffd10744463be0e4910ef4f750ae2e679a88777ee02328"
Jan 21 15:40:28 crc kubenswrapper[4773]: I0121 15:40:28.709212 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.312754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tzlwr"]
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.314149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.320962 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzlwr"]
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.401794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26hr\" (UniqueName: \"kubernetes.io/projected/219ae24e-95b5-4a93-b89b-335ef51b2166-kube-api-access-r26hr\") pod \"openstack-operator-index-tzlwr\" (UID: \"219ae24e-95b5-4a93-b89b-335ef51b2166\") " pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.503495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26hr\" (UniqueName: \"kubernetes.io/projected/219ae24e-95b5-4a93-b89b-335ef51b2166-kube-api-access-r26hr\") pod \"openstack-operator-index-tzlwr\" (UID: \"219ae24e-95b5-4a93-b89b-335ef51b2166\") " pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.523108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26hr\" (UniqueName: \"kubernetes.io/projected/219ae24e-95b5-4a93-b89b-335ef51b2166-kube-api-access-r26hr\") pod \"openstack-operator-index-tzlwr\" (UID: \"219ae24e-95b5-4a93-b89b-335ef51b2166\") " pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.639294 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:29 crc kubenswrapper[4773]: I0121 15:40:29.970597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44"}
Jan 21 15:40:30 crc kubenswrapper[4773]: I0121 15:40:30.051036 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzlwr"]
Jan 21 15:40:30 crc kubenswrapper[4773]: W0121 15:40:30.061382 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219ae24e_95b5_4a93_b89b_335ef51b2166.slice/crio-f098a8137600b8db678cea1607b402e6a553e9256343340d87446a930b87dcb3 WatchSource:0}: Error finding container f098a8137600b8db678cea1607b402e6a553e9256343340d87446a930b87dcb3: Status 404 returned error can't find the container with id f098a8137600b8db678cea1607b402e6a553e9256343340d87446a930b87dcb3
Jan 21 15:40:30 crc kubenswrapper[4773]: I0121 15:40:30.986930 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzlwr" event={"ID":"219ae24e-95b5-4a93-b89b-335ef51b2166","Type":"ContainerStarted","Data":"f098a8137600b8db678cea1607b402e6a553e9256343340d87446a930b87dcb3"}
Jan 21 15:40:31 crc kubenswrapper[4773]: I0121 15:40:31.531378 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dtvhb"
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:32.999903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h29mj" event={"ID":"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7","Type":"ContainerStarted","Data":"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"}
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.000030 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-h29mj" podUID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" containerName="registry-server" containerID="cri-o://3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31" gracePeriod=2
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.001738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzlwr" event={"ID":"219ae24e-95b5-4a93-b89b-335ef51b2166","Type":"ContainerStarted","Data":"6973aa251e7646fafacb5d7a4632dc34b37e61bfc291a286a99f1b87c4708d9d"}
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.013437 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h29mj" podStartSLOduration=2.8157605119999998 podStartE2EDuration="8.013421392s" podCreationTimestamp="2026-01-21 15:40:25 +0000 UTC" firstStartedPulling="2026-01-21 15:40:26.684540219 +0000 UTC m=+991.609029841" lastFinishedPulling="2026-01-21 15:40:31.882201099 +0000 UTC m=+996.806690721" observedRunningTime="2026-01-21 15:40:33.012948798 +0000 UTC m=+997.937438410" watchObservedRunningTime="2026-01-21 15:40:33.013421392 +0000 UTC m=+997.937911014"
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.032566 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tzlwr" podStartSLOduration=2.222893497 podStartE2EDuration="4.032539515s" podCreationTimestamp="2026-01-21 15:40:29 +0000 UTC" firstStartedPulling="2026-01-21 15:40:30.06413957 +0000 UTC m=+994.988629192" lastFinishedPulling="2026-01-21 15:40:31.873785588 +0000 UTC m=+996.798275210" observedRunningTime="2026-01-21 15:40:33.026188641 +0000 UTC m=+997.950678273" watchObservedRunningTime="2026-01-21 15:40:33.032539515 +0000 UTC m=+997.957029157"
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.527964 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.568041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgstf\" (UniqueName: \"kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf\") pod \"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7\" (UID: \"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7\") "
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.578977 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf" (OuterVolumeSpecName: "kube-api-access-jgstf") pod "ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" (UID: "ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7"). InnerVolumeSpecName "kube-api-access-jgstf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:40:33 crc kubenswrapper[4773]: I0121 15:40:33.669466 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgstf\" (UniqueName: \"kubernetes.io/projected/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7-kube-api-access-jgstf\") on node \"crc\" DevicePath \"\""
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.009620 4773 generic.go:334] "Generic (PLEG): container finished" podID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" containerID="3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31" exitCode=0
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.009667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h29mj" event={"ID":"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7","Type":"ContainerDied","Data":"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"}
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.009767 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h29mj" event={"ID":"ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7","Type":"ContainerDied","Data":"52a621794ccef9e6e065c6ede4b0c7e92514f93b90809b8af4c6c3decb0c9049"}
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.009785 4773 scope.go:117] "RemoveContainer" containerID="3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.009715 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h29mj"
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.031665 4773 scope.go:117] "RemoveContainer" containerID="3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"
Jan 21 15:40:34 crc kubenswrapper[4773]: E0121 15:40:34.032151 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31\": container with ID starting with 3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31 not found: ID does not exist" containerID="3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.032200 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31"} err="failed to get container status \"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31\": rpc error: code = NotFound desc = could not find container \"3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31\": container with ID starting with 3e0fc864fcdd8b1b709b03101a7ccb3b4a46d94d3511d4119857f9fc61aaeb31 not found: ID does not exist"
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.047651 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:34 crc kubenswrapper[4773]: I0121 15:40:34.051895 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-h29mj"]
Jan 21 15:40:35 crc kubenswrapper[4773]: I0121 15:40:35.392891 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" path="/var/lib/kubelet/pods/ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7/volumes"
Jan 21 15:40:39 crc kubenswrapper[4773]: I0121 15:40:39.641809 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:39 crc kubenswrapper[4773]: I0121 15:40:39.644001 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:39 crc kubenswrapper[4773]: I0121 15:40:39.671952 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:40 crc kubenswrapper[4773]: I0121 15:40:40.082533 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tzlwr"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.659827 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"]
Jan 21 15:40:45 crc kubenswrapper[4773]: E0121 15:40:45.660368 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" containerName="registry-server"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.660379 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" containerName="registry-server"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.660496 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3248c2-85b9-45b8-ab6e-26bbafe2e0f7" containerName="registry-server"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.661392 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.663320 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5vtkq"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.665065 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"]
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.727898 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.728144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"
Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.728372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm57n\" (UniqueName: \"kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") "
pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.829590 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm57n\" (UniqueName: \"kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.829638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.829735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.830205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.830268 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.847986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm57n\" (UniqueName: \"kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n\") pod \"2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:45 crc kubenswrapper[4773]: I0121 15:40:45.976764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:46 crc kubenswrapper[4773]: I0121 15:40:46.396774 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5"] Jan 21 15:40:47 crc kubenswrapper[4773]: I0121 15:40:47.094558 4773 generic.go:334] "Generic (PLEG): container finished" podID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerID="497988dc1a8417a34c77f41252d7a1e491c3fcf874a55089e5a7acfbb26d70f5" exitCode=0 Jan 21 15:40:47 crc kubenswrapper[4773]: I0121 15:40:47.094637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" event={"ID":"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44","Type":"ContainerDied","Data":"497988dc1a8417a34c77f41252d7a1e491c3fcf874a55089e5a7acfbb26d70f5"} Jan 21 15:40:47 crc kubenswrapper[4773]: I0121 15:40:47.096743 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" event={"ID":"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44","Type":"ContainerStarted","Data":"ae36e87f55347376edb3ddb2e1d58ae006135e7d91e13bec959124e13d85fae9"} Jan 21 15:40:55 crc kubenswrapper[4773]: I0121 15:40:55.156023 4773 generic.go:334] "Generic (PLEG): container finished" podID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerID="2095a3f2f15c06b6a836b8f7d9d0a9106513672d09686225a87847b819691af9" exitCode=0 Jan 21 15:40:55 crc kubenswrapper[4773]: I0121 15:40:55.156070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" event={"ID":"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44","Type":"ContainerDied","Data":"2095a3f2f15c06b6a836b8f7d9d0a9106513672d09686225a87847b819691af9"} Jan 21 15:40:56 crc kubenswrapper[4773]: I0121 15:40:56.164460 4773 generic.go:334] "Generic (PLEG): container finished" podID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerID="a8b7bef2ad92b186385cd36c205482da930f224decf81367ab06cdab4547cda8" exitCode=0 Jan 21 15:40:56 crc kubenswrapper[4773]: I0121 15:40:56.164564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" event={"ID":"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44","Type":"ContainerDied","Data":"a8b7bef2ad92b186385cd36c205482da930f224decf81367ab06cdab4547cda8"} Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.433713 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.497690 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm57n\" (UniqueName: \"kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n\") pod \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.497794 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle\") pod \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.497843 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util\") pod \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\" (UID: \"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44\") " Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.500471 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle" (OuterVolumeSpecName: "bundle") pod "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" (UID: "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.511474 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util" (OuterVolumeSpecName: "util") pod "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" (UID: "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.511837 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n" (OuterVolumeSpecName: "kube-api-access-wm57n") pod "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" (UID: "a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44"). InnerVolumeSpecName "kube-api-access-wm57n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.599167 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm57n\" (UniqueName: \"kubernetes.io/projected/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-kube-api-access-wm57n\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.599201 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:57 crc kubenswrapper[4773]: I0121 15:40:57.599254 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44-util\") on node \"crc\" DevicePath \"\"" Jan 21 15:40:58 crc kubenswrapper[4773]: I0121 15:40:58.178590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" event={"ID":"a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44","Type":"ContainerDied","Data":"ae36e87f55347376edb3ddb2e1d58ae006135e7d91e13bec959124e13d85fae9"} Jan 21 15:40:58 crc kubenswrapper[4773]: I0121 15:40:58.178634 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae36e87f55347376edb3ddb2e1d58ae006135e7d91e13bec959124e13d85fae9" Jan 21 15:40:58 crc kubenswrapper[4773]: I0121 15:40:58.178643 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.043494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc"] Jan 21 15:41:03 crc kubenswrapper[4773]: E0121 15:41:03.044324 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="pull" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.044340 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="pull" Jan 21 15:41:03 crc kubenswrapper[4773]: E0121 15:41:03.044351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="util" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.044358 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="util" Jan 21 15:41:03 crc kubenswrapper[4773]: E0121 15:41:03.044369 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="extract" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.044377 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="extract" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.044512 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44" containerName="extract" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.045119 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.046988 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hq2hp" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.070737 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc"] Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.070911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84b89\" (UniqueName: \"kubernetes.io/projected/baf015b3-f5b5-4467-8469-bccd49ba94ae-kube-api-access-84b89\") pod \"openstack-operator-controller-init-57bcf57cd7-v96fc\" (UID: \"baf015b3-f5b5-4467-8469-bccd49ba94ae\") " pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.172863 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84b89\" (UniqueName: \"kubernetes.io/projected/baf015b3-f5b5-4467-8469-bccd49ba94ae-kube-api-access-84b89\") pod \"openstack-operator-controller-init-57bcf57cd7-v96fc\" (UID: \"baf015b3-f5b5-4467-8469-bccd49ba94ae\") " pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.208376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84b89\" (UniqueName: \"kubernetes.io/projected/baf015b3-f5b5-4467-8469-bccd49ba94ae-kube-api-access-84b89\") pod \"openstack-operator-controller-init-57bcf57cd7-v96fc\" (UID: \"baf015b3-f5b5-4467-8469-bccd49ba94ae\") " pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.366193 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:03 crc kubenswrapper[4773]: I0121 15:41:03.598073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc"] Jan 21 15:41:04 crc kubenswrapper[4773]: I0121 15:41:04.272510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" event={"ID":"baf015b3-f5b5-4467-8469-bccd49ba94ae","Type":"ContainerStarted","Data":"e706533c5588071cb2b35b7eee870fed93f7afcac4cdeb499b8c7ab253f9f35d"} Jan 21 15:41:09 crc kubenswrapper[4773]: I0121 15:41:09.309575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" event={"ID":"baf015b3-f5b5-4467-8469-bccd49ba94ae","Type":"ContainerStarted","Data":"9d84d6a45260cc040a48bc617674e09a8660a75757925e87a582c65e8ed90cf9"} Jan 21 15:41:09 crc kubenswrapper[4773]: I0121 15:41:09.309964 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:09 crc kubenswrapper[4773]: I0121 15:41:09.339963 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" podStartSLOduration=1.317640166 podStartE2EDuration="6.339942638s" podCreationTimestamp="2026-01-21 15:41:03 +0000 UTC" firstStartedPulling="2026-01-21 15:41:03.609428044 +0000 UTC m=+1028.533917666" lastFinishedPulling="2026-01-21 15:41:08.631730516 +0000 UTC m=+1033.556220138" observedRunningTime="2026-01-21 15:41:09.334901879 +0000 UTC m=+1034.259391511" watchObservedRunningTime="2026-01-21 15:41:09.339942638 +0000 UTC m=+1034.264432260" Jan 21 15:41:13 crc kubenswrapper[4773]: I0121 15:41:13.369836 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.312171 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.314319 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.319502 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5xnkz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.325961 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.330580 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.331325 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.333394 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-g5h67" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.342138 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.351762 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.352593 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.354638 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-fgpgr" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.364574 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.372100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6rd\" (UniqueName: \"kubernetes.io/projected/516d7adc-2317-406b-92ef-6ed5a74a74b3-kube-api-access-4j6rd\") pod \"designate-operator-controller-manager-9f958b845-s7bsl\" (UID: \"516d7adc-2317-406b-92ef-6ed5a74a74b3\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.372316 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbswf\" 
(UniqueName: \"kubernetes.io/projected/8678664b-38d8-4482-ae3d-fa1a74a709fd-kube-api-access-mbswf\") pod \"cinder-operator-controller-manager-9b68f5989-lvnp8\" (UID: \"8678664b-38d8-4482-ae3d-fa1a74a709fd\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.372408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcxq\" (UniqueName: \"kubernetes.io/projected/330099a5-d43b-482e-a4cb-e6c3bb2c6706-kube-api-access-rwcxq\") pod \"barbican-operator-controller-manager-7ddb5c749-h8g4t\" (UID: \"330099a5-d43b-482e-a4cb-e6c3bb2c6706\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.388523 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.389338 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.401604 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-v2gjw" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.429440 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.437761 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.438805 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.443165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4qdhw" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.447032 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.457056 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.457977 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.461012 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mgjfr" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.473744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6rd\" (UniqueName: \"kubernetes.io/projected/516d7adc-2317-406b-92ef-6ed5a74a74b3-kube-api-access-4j6rd\") pod \"designate-operator-controller-manager-9f958b845-s7bsl\" (UID: \"516d7adc-2317-406b-92ef-6ed5a74a74b3\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.473836 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzn5\" (UniqueName: \"kubernetes.io/projected/8a54ffaf-3268-4696-952e-ee6381310628-kube-api-access-ngzn5\") pod \"heat-operator-controller-manager-594c8c9d5d-hpqt5\" (UID: \"8a54ffaf-3268-4696-952e-ee6381310628\") " 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.473877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbswf\" (UniqueName: \"kubernetes.io/projected/8678664b-38d8-4482-ae3d-fa1a74a709fd-kube-api-access-mbswf\") pod \"cinder-operator-controller-manager-9b68f5989-lvnp8\" (UID: \"8678664b-38d8-4482-ae3d-fa1a74a709fd\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.473926 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcxq\" (UniqueName: \"kubernetes.io/projected/330099a5-d43b-482e-a4cb-e6c3bb2c6706-kube-api-access-rwcxq\") pod \"barbican-operator-controller-manager-7ddb5c749-h8g4t\" (UID: \"330099a5-d43b-482e-a4cb-e6c3bb2c6706\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.473998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxx8l\" (UniqueName: \"kubernetes.io/projected/ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c-kube-api-access-bxx8l\") pod \"glance-operator-controller-manager-c6994669c-7kgnl\" (UID: \"ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.474027 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrpnh\" (UniqueName: \"kubernetes.io/projected/50f1e60f-1194-428b-b7e2-ccf0ebb384c7-kube-api-access-nrpnh\") pod \"horizon-operator-controller-manager-77d5c5b54f-wrww2\" (UID: \"50f1e60f-1194-428b-b7e2-ccf0ebb384c7\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 
15:41:48.482155 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.483038 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.492613 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zp7dj" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.492662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.492707 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.498369 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.505329 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.506262 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.509746 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fnj6g" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.521949 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbswf\" (UniqueName: \"kubernetes.io/projected/8678664b-38d8-4482-ae3d-fa1a74a709fd-kube-api-access-mbswf\") pod \"cinder-operator-controller-manager-9b68f5989-lvnp8\" (UID: \"8678664b-38d8-4482-ae3d-fa1a74a709fd\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.522471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcxq\" (UniqueName: \"kubernetes.io/projected/330099a5-d43b-482e-a4cb-e6c3bb2c6706-kube-api-access-rwcxq\") pod \"barbican-operator-controller-manager-7ddb5c749-h8g4t\" (UID: \"330099a5-d43b-482e-a4cb-e6c3bb2c6706\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.529207 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.530213 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.532244 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ddlkk" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.548933 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6rd\" (UniqueName: \"kubernetes.io/projected/516d7adc-2317-406b-92ef-6ed5a74a74b3-kube-api-access-4j6rd\") pod \"designate-operator-controller-manager-9f958b845-s7bsl\" (UID: \"516d7adc-2317-406b-92ef-6ed5a74a74b3\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.554798 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxx8l\" (UniqueName: \"kubernetes.io/projected/ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c-kube-api-access-bxx8l\") pod \"glance-operator-controller-manager-c6994669c-7kgnl\" (UID: \"ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576707 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrpnh\" (UniqueName: \"kubernetes.io/projected/50f1e60f-1194-428b-b7e2-ccf0ebb384c7-kube-api-access-nrpnh\") pod \"horizon-operator-controller-manager-77d5c5b54f-wrww2\" (UID: \"50f1e60f-1194-428b-b7e2-ccf0ebb384c7\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576751 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw4z\" (UniqueName: \"kubernetes.io/projected/d60c449c-a583-4b8e-8265-9df068220041-kube-api-access-mrw4z\") pod \"keystone-operator-controller-manager-767fdc4f47-fcbjv\" (UID: \"d60c449c-a583-4b8e-8265-9df068220041\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jng6w\" (UniqueName: \"kubernetes.io/projected/fdfe2fce-12c1-4026-b40f-77234a609986-kube-api-access-jng6w\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576836 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzn5\" (UniqueName: \"kubernetes.io/projected/8a54ffaf-3268-4696-952e-ee6381310628-kube-api-access-ngzn5\") pod \"heat-operator-controller-manager-594c8c9d5d-hpqt5\" (UID: \"8a54ffaf-3268-4696-952e-ee6381310628\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.576866 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9qv\" (UniqueName: 
\"kubernetes.io/projected/32f1de73-4ee0-4eda-8709-d1642d8452f2-kube-api-access-8x9qv\") pod \"ironic-operator-controller-manager-78757b4889-c79x5\" (UID: \"32f1de73-4ee0-4eda-8709-d1642d8452f2\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.586829 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.594197 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.601060 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.623543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrpnh\" (UniqueName: \"kubernetes.io/projected/50f1e60f-1194-428b-b7e2-ccf0ebb384c7-kube-api-access-nrpnh\") pod \"horizon-operator-controller-manager-77d5c5b54f-wrww2\" (UID: \"50f1e60f-1194-428b-b7e2-ccf0ebb384c7\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.623883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q59w5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.632408 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzn5\" (UniqueName: \"kubernetes.io/projected/8a54ffaf-3268-4696-952e-ee6381310628-kube-api-access-ngzn5\") pod \"heat-operator-controller-manager-594c8c9d5d-hpqt5\" (UID: \"8a54ffaf-3268-4696-952e-ee6381310628\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 
15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.632854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.633247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxx8l\" (UniqueName: \"kubernetes.io/projected/ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c-kube-api-access-bxx8l\") pod \"glance-operator-controller-manager-c6994669c-7kgnl\" (UID: \"ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.647222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.668281 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.669417 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.670630 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677337 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhk9\" (UniqueName: \"kubernetes.io/projected/2264dd36-5855-49cc-bf31-1d1e9dcb1f9f-kube-api-access-bkhk9\") pod \"manila-operator-controller-manager-864f6b75bf-xdcfb\" (UID: \"2264dd36-5855-49cc-bf31-1d1e9dcb1f9f\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677480 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9qv\" (UniqueName: \"kubernetes.io/projected/32f1de73-4ee0-4eda-8709-d1642d8452f2-kube-api-access-8x9qv\") pod \"ironic-operator-controller-manager-78757b4889-c79x5\" (UID: \"32f1de73-4ee0-4eda-8709-d1642d8452f2\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677590 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw4z\" (UniqueName: \"kubernetes.io/projected/d60c449c-a583-4b8e-8265-9df068220041-kube-api-access-mrw4z\") pod \"keystone-operator-controller-manager-767fdc4f47-fcbjv\" (UID: \"d60c449c-a583-4b8e-8265-9df068220041\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677807 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfcd\" (UniqueName: \"kubernetes.io/projected/43aaba2b-296a-407d-9ea2-bbf4c05e868e-kube-api-access-2lfcd\") pod \"mariadb-operator-controller-manager-c87fff755-jz6m4\" (UID: \"43aaba2b-296a-407d-9ea2-bbf4c05e868e\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.677896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jng6w\" (UniqueName: \"kubernetes.io/projected/fdfe2fce-12c1-4026-b40f-77234a609986-kube-api-access-jng6w\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: E0121 15:41:48.678643 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:48 crc kubenswrapper[4773]: E0121 15:41:48.678765 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:49.178749763 +0000 UTC m=+1074.103239385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.679679 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.688803 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.694253 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-m4kgg" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.706974 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.710408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.717940 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.723864 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tlbnr" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.736443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9qv\" (UniqueName: \"kubernetes.io/projected/32f1de73-4ee0-4eda-8709-d1642d8452f2-kube-api-access-8x9qv\") pod \"ironic-operator-controller-manager-78757b4889-c79x5\" (UID: \"32f1de73-4ee0-4eda-8709-d1642d8452f2\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.761910 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.762482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw4z\" (UniqueName: \"kubernetes.io/projected/d60c449c-a583-4b8e-8265-9df068220041-kube-api-access-mrw4z\") pod \"keystone-operator-controller-manager-767fdc4f47-fcbjv\" (UID: \"d60c449c-a583-4b8e-8265-9df068220041\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.770251 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.770492 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.779909 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58f9\" (UniqueName: \"kubernetes.io/projected/b7286f1c-434c-4ebb-9d2a-54a6596a63b5-kube-api-access-n58f9\") pod \"nova-operator-controller-manager-65849867d6-nwvmf\" (UID: \"b7286f1c-434c-4ebb-9d2a-54a6596a63b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.779985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5kxh\" (UniqueName: \"kubernetes.io/projected/9a48a802-404e-4a60-821b-8b91a4830da8-kube-api-access-w5kxh\") pod \"neutron-operator-controller-manager-cb4666565-d9k6n\" (UID: \"9a48a802-404e-4a60-821b-8b91a4830da8\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.780011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfcd\" (UniqueName: \"kubernetes.io/projected/43aaba2b-296a-407d-9ea2-bbf4c05e868e-kube-api-access-2lfcd\") pod \"mariadb-operator-controller-manager-c87fff755-jz6m4\" (UID: \"43aaba2b-296a-407d-9ea2-bbf4c05e868e\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.780051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhk9\" (UniqueName: \"kubernetes.io/projected/2264dd36-5855-49cc-bf31-1d1e9dcb1f9f-kube-api-access-bkhk9\") pod \"manila-operator-controller-manager-864f6b75bf-xdcfb\" (UID: \"2264dd36-5855-49cc-bf31-1d1e9dcb1f9f\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:41:48 crc 
kubenswrapper[4773]: I0121 15:41:48.786028 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.786303 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-sfbwb" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.800247 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.800290 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.801095 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.801758 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.805921 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tvrkj" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.813090 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.863586 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.866443 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.873597 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-69fkq" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.878315 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.879742 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n58f9\" (UniqueName: \"kubernetes.io/projected/b7286f1c-434c-4ebb-9d2a-54a6596a63b5-kube-api-access-n58f9\") pod \"nova-operator-controller-manager-65849867d6-nwvmf\" (UID: \"b7286f1c-434c-4ebb-9d2a-54a6596a63b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6psw\" (UniqueName: \"kubernetes.io/projected/d8cb173b-7eaa-4183-8028-0a1c4730097c-kube-api-access-g6psw\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp27z\" (UniqueName: \"kubernetes.io/projected/0f906590-d519-4724-bc67-05c6b3a9191d-kube-api-access-mp27z\") pod \"ovn-operator-controller-manager-55db956ddc-cn7gp\" (UID: \"0f906590-d519-4724-bc67-05c6b3a9191d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5kxh\" (UniqueName: \"kubernetes.io/projected/9a48a802-404e-4a60-821b-8b91a4830da8-kube-api-access-w5kxh\") pod \"neutron-operator-controller-manager-cb4666565-d9k6n\" (UID: \"9a48a802-404e-4a60-821b-8b91a4830da8\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.880688 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xwpq\" (UniqueName: \"kubernetes.io/projected/ddac57c3-b102-4cfc-8b1e-53de342cef39-kube-api-access-2xwpq\") pod \"octavia-operator-controller-manager-7fc9b76cf6-kn644\" (UID: \"ddac57c3-b102-4cfc-8b1e-53de342cef39\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.881799 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gdm7s" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.882931 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.886954 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.893197 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhk9\" (UniqueName: \"kubernetes.io/projected/2264dd36-5855-49cc-bf31-1d1e9dcb1f9f-kube-api-access-bkhk9\") pod \"manila-operator-controller-manager-864f6b75bf-xdcfb\" (UID: \"2264dd36-5855-49cc-bf31-1d1e9dcb1f9f\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.898340 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.900171 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.921479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5kxh\" (UniqueName: \"kubernetes.io/projected/9a48a802-404e-4a60-821b-8b91a4830da8-kube-api-access-w5kxh\") pod \"neutron-operator-controller-manager-cb4666565-d9k6n\" (UID: \"9a48a802-404e-4a60-821b-8b91a4830da8\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.921546 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.922326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfcd\" (UniqueName: \"kubernetes.io/projected/43aaba2b-296a-407d-9ea2-bbf4c05e868e-kube-api-access-2lfcd\") pod \"mariadb-operator-controller-manager-c87fff755-jz6m4\" (UID: \"43aaba2b-296a-407d-9ea2-bbf4c05e868e\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.927125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jng6w\" (UniqueName: \"kubernetes.io/projected/fdfe2fce-12c1-4026-b40f-77234a609986-kube-api-access-jng6w\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.933946 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.935172 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.945401 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vvfdl" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.963213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26"] Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.981283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58f9\" (UniqueName: \"kubernetes.io/projected/b7286f1c-434c-4ebb-9d2a-54a6596a63b5-kube-api-access-n58f9\") pod \"nova-operator-controller-manager-65849867d6-nwvmf\" (UID: \"b7286f1c-434c-4ebb-9d2a-54a6596a63b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.981562 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6psw\" (UniqueName: \"kubernetes.io/projected/d8cb173b-7eaa-4183-8028-0a1c4730097c-kube-api-access-g6psw\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.981613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.981657 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mp27z\" (UniqueName: \"kubernetes.io/projected/0f906590-d519-4724-bc67-05c6b3a9191d-kube-api-access-mp27z\") pod \"ovn-operator-controller-manager-55db956ddc-cn7gp\" (UID: \"0f906590-d519-4724-bc67-05c6b3a9191d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.981732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xwpq\" (UniqueName: \"kubernetes.io/projected/ddac57c3-b102-4cfc-8b1e-53de342cef39-kube-api-access-2xwpq\") pod \"octavia-operator-controller-manager-7fc9b76cf6-kn644\" (UID: \"ddac57c3-b102-4cfc-8b1e-53de342cef39\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:41:48 crc kubenswrapper[4773]: E0121 15:41:48.981999 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:48 crc kubenswrapper[4773]: E0121 15:41:48.982079 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert podName:d8cb173b-7eaa-4183-8028-0a1c4730097c nodeName:}" failed. No retries permitted until 2026-01-21 15:41:49.482062671 +0000 UTC m=+1074.406552293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" (UID: "d8cb173b-7eaa-4183-8028-0a1c4730097c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:48 crc kubenswrapper[4773]: I0121 15:41:48.986079 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.006777 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.007566 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.014100 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp27z\" (UniqueName: \"kubernetes.io/projected/0f906590-d519-4724-bc67-05c6b3a9191d-kube-api-access-mp27z\") pod \"ovn-operator-controller-manager-55db956ddc-cn7gp\" (UID: \"0f906590-d519-4724-bc67-05c6b3a9191d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.014455 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.015795 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rpwh7" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.018627 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.019793 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.024077 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jwhkj" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.029273 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.043056 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.048633 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.049792 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.059089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xtjks" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.059315 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xwpq\" (UniqueName: \"kubernetes.io/projected/ddac57c3-b102-4cfc-8b1e-53de342cef39-kube-api-access-2xwpq\") pod \"octavia-operator-controller-manager-7fc9b76cf6-kn644\" (UID: \"ddac57c3-b102-4cfc-8b1e-53de342cef39\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.060206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6psw\" (UniqueName: \"kubernetes.io/projected/d8cb173b-7eaa-4183-8028-0a1c4730097c-kube-api-access-g6psw\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.065760 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.080898 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.084748 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.086070 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbx92\" (UniqueName: \"kubernetes.io/projected/eeb5c272-4544-47a4-8d08-187872fea7bd-kube-api-access-cbx92\") pod \"test-operator-controller-manager-7cd8bc9dbb-94knv\" (UID: \"eeb5c272-4544-47a4-8d08-187872fea7bd\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.086131 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9cnw\" (UniqueName: \"kubernetes.io/projected/7ab140ad-f64b-45e3-a393-f66567e98a9f-kube-api-access-p9cnw\") pod \"swift-operator-controller-manager-85dd56d4cc-xs55j\" (UID: \"7ab140ad-f64b-45e3-a393-f66567e98a9f\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.086175 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmnn\" (UniqueName: \"kubernetes.io/projected/6411a5d3-7b7b-4735-b01c-7c4aa0d5509c-kube-api-access-zzmnn\") pod \"placement-operator-controller-manager-686df47fcb-2bt26\" (UID: \"6411a5d3-7b7b-4735-b01c-7c4aa0d5509c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.086194 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb58k\" (UniqueName: \"kubernetes.io/projected/4a9d0079-9636-4913-95fd-305e8d54280d-kube-api-access-qb58k\") pod \"telemetry-operator-controller-manager-5c4ff57dc8-78tss\" (UID: \"4a9d0079-9636-4913-95fd-305e8d54280d\") " 
pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.130388 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.131637 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.135378 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9zxtx" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.157124 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.182419 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.187406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbx92\" (UniqueName: \"kubernetes.io/projected/eeb5c272-4544-47a4-8d08-187872fea7bd-kube-api-access-cbx92\") pod \"test-operator-controller-manager-7cd8bc9dbb-94knv\" (UID: \"eeb5c272-4544-47a4-8d08-187872fea7bd\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.187479 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9cnw\" (UniqueName: \"kubernetes.io/projected/7ab140ad-f64b-45e3-a393-f66567e98a9f-kube-api-access-p9cnw\") pod \"swift-operator-controller-manager-85dd56d4cc-xs55j\" (UID: \"7ab140ad-f64b-45e3-a393-f66567e98a9f\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.187514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmnn\" (UniqueName: \"kubernetes.io/projected/6411a5d3-7b7b-4735-b01c-7c4aa0d5509c-kube-api-access-zzmnn\") pod \"placement-operator-controller-manager-686df47fcb-2bt26\" (UID: \"6411a5d3-7b7b-4735-b01c-7c4aa0d5509c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.187534 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb58k\" (UniqueName: \"kubernetes.io/projected/4a9d0079-9636-4913-95fd-305e8d54280d-kube-api-access-qb58k\") pod \"telemetry-operator-controller-manager-5c4ff57dc8-78tss\" (UID: \"4a9d0079-9636-4913-95fd-305e8d54280d\") " pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 
15:41:49.187558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94f9\" (UniqueName: \"kubernetes.io/projected/d7b52472-5c30-471f-a937-c50d96103339-kube-api-access-b94f9\") pod \"watcher-operator-controller-manager-64cd966744-2wbz9\" (UID: \"d7b52472-5c30-471f-a937-c50d96103339\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.187605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.187732 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.187785 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:50.187765272 +0000 UTC m=+1075.112254894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.234896 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.253262 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmnn\" (UniqueName: \"kubernetes.io/projected/6411a5d3-7b7b-4735-b01c-7c4aa0d5509c-kube-api-access-zzmnn\") pod \"placement-operator-controller-manager-686df47fcb-2bt26\" (UID: \"6411a5d3-7b7b-4735-b01c-7c4aa0d5509c\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.255525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbx92\" (UniqueName: \"kubernetes.io/projected/eeb5c272-4544-47a4-8d08-187872fea7bd-kube-api-access-cbx92\") pod \"test-operator-controller-manager-7cd8bc9dbb-94knv\" (UID: \"eeb5c272-4544-47a4-8d08-187872fea7bd\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.265474 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb58k\" (UniqueName: \"kubernetes.io/projected/4a9d0079-9636-4913-95fd-305e8d54280d-kube-api-access-qb58k\") pod \"telemetry-operator-controller-manager-5c4ff57dc8-78tss\" (UID: \"4a9d0079-9636-4913-95fd-305e8d54280d\") " pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.270035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9cnw\" (UniqueName: \"kubernetes.io/projected/7ab140ad-f64b-45e3-a393-f66567e98a9f-kube-api-access-p9cnw\") pod \"swift-operator-controller-manager-85dd56d4cc-xs55j\" (UID: \"7ab140ad-f64b-45e3-a393-f66567e98a9f\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.288739 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94f9\" (UniqueName: \"kubernetes.io/projected/d7b52472-5c30-471f-a937-c50d96103339-kube-api-access-b94f9\") pod \"watcher-operator-controller-manager-64cd966744-2wbz9\" (UID: \"d7b52472-5c30-471f-a937-c50d96103339\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.311804 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.313365 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.318917 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nr48l" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.319047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.320776 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.322606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94f9\" (UniqueName: \"kubernetes.io/projected/d7b52472-5c30-471f-a937-c50d96103339-kube-api-access-b94f9\") pod \"watcher-operator-controller-manager-64cd966744-2wbz9\" (UID: \"d7b52472-5c30-471f-a937-c50d96103339\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.334080 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.344236 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.355574 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.356988 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.358939 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ftvjn" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.361144 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.457380 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.482074 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.495120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.495217 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.495311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.495337 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7687\" (UniqueName: \"kubernetes.io/projected/83918de1-f089-46b5-99e4-b249fbe09d65-kube-api-access-n7687\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc 
kubenswrapper[4773]: I0121 15:41:49.495372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvqhv\" (UniqueName: \"kubernetes.io/projected/02b38141-e855-4eeb-ac52-d135fb5f44f7-kube-api-access-jvqhv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dp2s7\" (UID: \"02b38141-e855-4eeb-ac52-d135fb5f44f7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.495866 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.496099 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert podName:d8cb173b-7eaa-4183-8028-0a1c4730097c nodeName:}" failed. No retries permitted until 2026-01-21 15:41:50.496077418 +0000 UTC m=+1075.420567040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" (UID: "d8cb173b-7eaa-4183-8028-0a1c4730097c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.542341 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.554784 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.597141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.597238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.597257 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7687\" (UniqueName: \"kubernetes.io/projected/83918de1-f089-46b5-99e4-b249fbe09d65-kube-api-access-n7687\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.597283 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvqhv\" (UniqueName: \"kubernetes.io/projected/02b38141-e855-4eeb-ac52-d135fb5f44f7-kube-api-access-jvqhv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dp2s7\" (UID: \"02b38141-e855-4eeb-ac52-d135fb5f44f7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.597755 4773 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.598949 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.598991 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:50.098975939 +0000 UTC m=+1075.023465561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "metrics-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: E0121 15:41:49.599015 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:50.09900799 +0000 UTC m=+1075.023497612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "webhook-server-cert" not found Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.615300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7687\" (UniqueName: \"kubernetes.io/projected/83918de1-f089-46b5-99e4-b249fbe09d65-kube-api-access-n7687\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.615333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvqhv\" (UniqueName: \"kubernetes.io/projected/02b38141-e855-4eeb-ac52-d135fb5f44f7-kube-api-access-jvqhv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dp2s7\" (UID: \"02b38141-e855-4eeb-ac52-d135fb5f44f7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.694548 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.956922 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl"] Jan 21 15:41:49 crc kubenswrapper[4773]: I0121 15:41:49.988491 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.106574 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.106651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.106784 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.106828 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:51.106815826 +0000 UTC m=+1076.031305438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.107036 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.107118 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:51.107099104 +0000 UTC m=+1076.031588726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "metrics-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.107579 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60c449c_a583_4b8e_8265_9df068220041.slice/crio-f3f2527e102f7371f7b34405e6231f5971ac67f809d577d65801da854fc15208 WatchSource:0}: Error finding container f3f2527e102f7371f7b34405e6231f5971ac67f809d577d65801da854fc15208: Status 404 returned error can't find the container with id f3f2527e102f7371f7b34405e6231f5971ac67f809d577d65801da854fc15208 Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.120880 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.128871 4773 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5"] Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.136322 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce51f4fd_fd32_4d8f_bb25_c5a7d4b4680c.slice/crio-9c11bc372c7a0380a9cbaa82953db8679842f0c760c912950a1e608b09edcf63 WatchSource:0}: Error finding container 9c11bc372c7a0380a9cbaa82953db8679842f0c760c912950a1e608b09edcf63: Status 404 returned error can't find the container with id 9c11bc372c7a0380a9cbaa82953db8679842f0c760c912950a1e608b09edcf63 Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.137179 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330099a5_d43b_482e_a4cb_e6c3bb2c6706.slice/crio-274ac84ac009f9493df20538bad803d5979886766e9fad2d0a613d832eaa53ec WatchSource:0}: Error finding container 274ac84ac009f9493df20538bad803d5979886766e9fad2d0a613d832eaa53ec: Status 404 returned error can't find the container with id 274ac84ac009f9493df20538bad803d5979886766e9fad2d0a613d832eaa53ec Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.143467 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.152758 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl"] Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.157481 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f1de73_4ee0_4eda_8709_d1642d8452f2.slice/crio-a9c6919f9b6502bfbd5289d7ad82e8d2d30683b1c91d93bbae422ad8f427ed57 WatchSource:0}: Error finding container a9c6919f9b6502bfbd5289d7ad82e8d2d30683b1c91d93bbae422ad8f427ed57: Status 404 returned 
error can't find the container with id a9c6919f9b6502bfbd5289d7ad82e8d2d30683b1c91d93bbae422ad8f427ed57 Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.158428 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.208051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.208305 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.208372 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:52.208352661 +0000 UTC m=+1077.132842283 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.308604 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.313006 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4"] Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.324596 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43aaba2b_296a_407d_9ea2_bbf4c05e868e.slice/crio-dc15f7f67179d09faac1666473fac4b3dcc46fd08d1b5d5c8dce56f857c3e574 WatchSource:0}: Error finding container dc15f7f67179d09faac1666473fac4b3dcc46fd08d1b5d5c8dce56f857c3e574: Status 404 returned error can't find the container with id dc15f7f67179d09faac1666473fac4b3dcc46fd08d1b5d5c8dce56f857c3e574 Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.326066 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f906590_d519_4724_bc67_05c6b3a9191d.slice/crio-0607102645c1a543503cd35e15411394c3732e0e6e256d4004640a09ba63b050 WatchSource:0}: Error finding container 0607102645c1a543503cd35e15411394c3732e0e6e256d4004640a09ba63b050: Status 404 returned error can't find the container with id 0607102645c1a543503cd35e15411394c3732e0e6e256d4004640a09ba63b050 Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.347181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 
15:41:50.355394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2"] Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.355993 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50f1e60f_1194_428b_b7e2_ccf0ebb384c7.slice/crio-bcdfb7f34545e90076bde79e4792ead0e969189b1b0a5d8a62dbf70fb5d0b261 WatchSource:0}: Error finding container bcdfb7f34545e90076bde79e4792ead0e969189b1b0a5d8a62dbf70fb5d0b261: Status 404 returned error can't find the container with id bcdfb7f34545e90076bde79e4792ead0e969189b1b0a5d8a62dbf70fb5d0b261 Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.362727 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2264dd36_5855_49cc_bf31_1d1e9dcb1f9f.slice/crio-b02fd5060862d6d813784fd0b333f72ca08c7907c987d3f36071ad6c2b6ed867 WatchSource:0}: Error finding container b02fd5060862d6d813784fd0b333f72ca08c7907c987d3f36071ad6c2b6ed867: Status 404 returned error can't find the container with id b02fd5060862d6d813784fd0b333f72ca08c7907c987d3f36071ad6c2b6ed867 Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.370465 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644"] Jan 21 15:41:50 crc kubenswrapper[4773]: W0121 15:41:50.392240 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddac57c3_b102_4cfc_8b1e_53de342cef39.slice/crio-1368cd5ff2a069178f08bd77054d7164b9047754416e7839d882796f18c2dd66 WatchSource:0}: Error finding container 1368cd5ff2a069178f08bd77054d7164b9047754416e7839d882796f18c2dd66: Status 404 returned error can't find the container with id 1368cd5ff2a069178f08bd77054d7164b9047754416e7839d882796f18c2dd66 Jan 21 15:41:50 crc 
kubenswrapper[4773]: I0121 15:41:50.395600 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb"] Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.395967 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xwpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-kn644_openstack-operators(ddac57c3-b102-4cfc-8b1e-53de342cef39): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.398656 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" podUID="ddac57c3-b102-4cfc-8b1e-53de342cef39" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.523545 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.524352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:50 crc kubenswrapper[4773]: 
E0121 15:41:50.524994 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.525037 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert podName:d8cb173b-7eaa-4183-8028-0a1c4730097c nodeName:}" failed. No retries permitted until 2026-01-21 15:41:52.525024745 +0000 UTC m=+1077.449514357 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" (UID: "d8cb173b-7eaa-4183-8028-0a1c4730097c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.538641 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.545170 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.564684 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.565252 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.572016 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9"] Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.574471 4773 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b94f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-2wbz9_openstack-operators(d7b52472-5c30-471f-a937-c50d96103339): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.574617 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvqhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dp2s7_openstack-operators(02b38141-e855-4eeb-ac52-d135fb5f44f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.574763 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.179:5001/openstack-k8s-operators/telemetry-operator:eb64f15362ce8fe083224b8876330e95b4455acc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 
-3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qb58k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5c4ff57dc8-78tss_openstack-operators(4a9d0079-9636-4913-95fd-305e8d54280d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.574900 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w5kxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-cb4666565-d9k6n_openstack-operators(9a48a802-404e-4a60-821b-8b91a4830da8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.575015 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zzmnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-2bt26_openstack-operators(6411a5d3-7b7b-4735-b01c-7c4aa0d5509c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.575377 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9cnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-xs55j_openstack-operators(7ab140ad-f64b-45e3-a393-f66567e98a9f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.575934 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" podUID="d7b52472-5c30-471f-a937-c50d96103339" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.575927 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" podUID="4a9d0079-9636-4913-95fd-305e8d54280d" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.576009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" podUID="9a48a802-404e-4a60-821b-8b91a4830da8" Jan 21 
15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.576005 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" podUID="02b38141-e855-4eeb-ac52-d135fb5f44f7" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.576443 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" podUID="6411a5d3-7b7b-4735-b01c-7c4aa0d5509c" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.576514 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" podUID="7ab140ad-f64b-45e3-a393-f66567e98a9f" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.578467 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7"] Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.631037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" event={"ID":"43aaba2b-296a-407d-9ea2-bbf4c05e868e","Type":"ContainerStarted","Data":"dc15f7f67179d09faac1666473fac4b3dcc46fd08d1b5d5c8dce56f857c3e574"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.631995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" event={"ID":"8a54ffaf-3268-4696-952e-ee6381310628","Type":"ContainerStarted","Data":"0c50f0046203ba9f1b3d1bcd605babbfac05e52574fe89e49ed16dee9b6b131d"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.633076 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" event={"ID":"6411a5d3-7b7b-4735-b01c-7c4aa0d5509c","Type":"ContainerStarted","Data":"974544d4823f0c8837bb5dd59a7a5cf2d55f915687e51711bbce2c1146c93f85"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.635800 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" podUID="6411a5d3-7b7b-4735-b01c-7c4aa0d5509c" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.641147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" event={"ID":"4a9d0079-9636-4913-95fd-305e8d54280d","Type":"ContainerStarted","Data":"065a05a70ffa9d513b33c4fd7dfba6f6c5aef9e6b1d9108a5f52b3326d857bd8"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.642282 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.179:5001/openstack-k8s-operators/telemetry-operator:eb64f15362ce8fe083224b8876330e95b4455acc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" podUID="4a9d0079-9636-4913-95fd-305e8d54280d" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.642414 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" event={"ID":"516d7adc-2317-406b-92ef-6ed5a74a74b3","Type":"ContainerStarted","Data":"4d76a4a76be20a6605ad6e9397940f0f67e64dc2bd9f9237903800227a3394b2"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.643253 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" event={"ID":"0f906590-d519-4724-bc67-05c6b3a9191d","Type":"ContainerStarted","Data":"0607102645c1a543503cd35e15411394c3732e0e6e256d4004640a09ba63b050"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.644626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" event={"ID":"b7286f1c-434c-4ebb-9d2a-54a6596a63b5","Type":"ContainerStarted","Data":"1dc5a849de0e2885b9d4fbaa98d871d66734c523d440a64abe886c46e34280c9"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.645439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" event={"ID":"2264dd36-5855-49cc-bf31-1d1e9dcb1f9f","Type":"ContainerStarted","Data":"b02fd5060862d6d813784fd0b333f72ca08c7907c987d3f36071ad6c2b6ed867"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.650025 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" event={"ID":"9a48a802-404e-4a60-821b-8b91a4830da8","Type":"ContainerStarted","Data":"116193953fc912f2640075ed712ec6435edf2ec05aa96c1e58c9a0f8e63ffddf"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.651844 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" podUID="9a48a802-404e-4a60-821b-8b91a4830da8" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.652744 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" 
event={"ID":"d7b52472-5c30-471f-a937-c50d96103339","Type":"ContainerStarted","Data":"d20a8370a7d5014fd34859ba50ead25cf129085a27bb30e88d6229e4de68e5da"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.654677 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" event={"ID":"8678664b-38d8-4482-ae3d-fa1a74a709fd","Type":"ContainerStarted","Data":"681f1efeb4523ee168d217b8497fa6761091bbe258b9ab646386582aa672ba89"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.654855 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" podUID="d7b52472-5c30-471f-a937-c50d96103339" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.658907 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" event={"ID":"ddac57c3-b102-4cfc-8b1e-53de342cef39","Type":"ContainerStarted","Data":"1368cd5ff2a069178f08bd77054d7164b9047754416e7839d882796f18c2dd66"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.663993 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" event={"ID":"02b38141-e855-4eeb-ac52-d135fb5f44f7","Type":"ContainerStarted","Data":"df765f5724a64a158fe8a7fc930be712de26e98e1c690e8256e9cf03a8e20066"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.664086 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" podUID="ddac57c3-b102-4cfc-8b1e-53de342cef39" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.666234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" event={"ID":"7ab140ad-f64b-45e3-a393-f66567e98a9f","Type":"ContainerStarted","Data":"1560f609e40052342c1074bee7f997534900c30983cbfaf74599f0b4fb085ef6"} Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.669013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" podUID="7ab140ad-f64b-45e3-a393-f66567e98a9f" Jan 21 15:41:50 crc kubenswrapper[4773]: E0121 15:41:50.669104 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" podUID="02b38141-e855-4eeb-ac52-d135fb5f44f7" Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.679145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" event={"ID":"330099a5-d43b-482e-a4cb-e6c3bb2c6706","Type":"ContainerStarted","Data":"274ac84ac009f9493df20538bad803d5979886766e9fad2d0a613d832eaa53ec"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.686609 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" event={"ID":"d60c449c-a583-4b8e-8265-9df068220041","Type":"ContainerStarted","Data":"f3f2527e102f7371f7b34405e6231f5971ac67f809d577d65801da854fc15208"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.705260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" event={"ID":"ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c","Type":"ContainerStarted","Data":"9c11bc372c7a0380a9cbaa82953db8679842f0c760c912950a1e608b09edcf63"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.718049 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" event={"ID":"eeb5c272-4544-47a4-8d08-187872fea7bd","Type":"ContainerStarted","Data":"d2ec1fc9b6b4dbc25e0916b996e81975f51dfd69fd62029006e8db53d96fcfb0"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.735121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" event={"ID":"32f1de73-4ee0-4eda-8709-d1642d8452f2","Type":"ContainerStarted","Data":"a9c6919f9b6502bfbd5289d7ad82e8d2d30683b1c91d93bbae422ad8f427ed57"} Jan 21 15:41:50 crc kubenswrapper[4773]: I0121 15:41:50.748876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" event={"ID":"50f1e60f-1194-428b-b7e2-ccf0ebb384c7","Type":"ContainerStarted","Data":"bcdfb7f34545e90076bde79e4792ead0e969189b1b0a5d8a62dbf70fb5d0b261"} Jan 21 15:41:51 crc kubenswrapper[4773]: I0121 15:41:51.137644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: 
\"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:51 crc kubenswrapper[4773]: I0121 15:41:51.137786 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.137834 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.137940 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.137947 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:53.137920623 +0000 UTC m=+1078.062410255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "webhook-server-cert" not found Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.138016 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:53.137997865 +0000 UTC m=+1078.062487607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "metrics-server-cert" not found Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.755716 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" podUID="9a48a802-404e-4a60-821b-8b91a4830da8" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.755718 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" podUID="7ab140ad-f64b-45e3-a393-f66567e98a9f" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.756628 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" podUID="02b38141-e855-4eeb-ac52-d135fb5f44f7" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.756652 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" podUID="ddac57c3-b102-4cfc-8b1e-53de342cef39" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.756677 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" podUID="d7b52472-5c30-471f-a937-c50d96103339" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.756728 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.179:5001/openstack-k8s-operators/telemetry-operator:eb64f15362ce8fe083224b8876330e95b4455acc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" podUID="4a9d0079-9636-4913-95fd-305e8d54280d" Jan 21 15:41:51 crc kubenswrapper[4773]: E0121 15:41:51.756767 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" podUID="6411a5d3-7b7b-4735-b01c-7c4aa0d5509c" Jan 21 15:41:52 crc kubenswrapper[4773]: I0121 15:41:52.257591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: 
\"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:52 crc kubenswrapper[4773]: E0121 15:41:52.257814 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:52 crc kubenswrapper[4773]: E0121 15:41:52.257923 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:56.257898957 +0000 UTC m=+1081.182388619 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:52 crc kubenswrapper[4773]: I0121 15:41:52.561646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:52 crc kubenswrapper[4773]: E0121 15:41:52.561816 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:52 crc kubenswrapper[4773]: E0121 15:41:52.562234 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert podName:d8cb173b-7eaa-4183-8028-0a1c4730097c nodeName:}" failed. 
No retries permitted until 2026-01-21 15:41:56.562205462 +0000 UTC m=+1081.486695084 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" (UID: "d8cb173b-7eaa-4183-8028-0a1c4730097c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:53 crc kubenswrapper[4773]: I0121 15:41:53.172367 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:53 crc kubenswrapper[4773]: I0121 15:41:53.172468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:53 crc kubenswrapper[4773]: E0121 15:41:53.172551 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:41:53 crc kubenswrapper[4773]: E0121 15:41:53.172551 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:41:53 crc kubenswrapper[4773]: E0121 15:41:53.172709 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:41:57.172677714 +0000 UTC m=+1082.097167336 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "webhook-server-cert" not found Jan 21 15:41:53 crc kubenswrapper[4773]: E0121 15:41:53.172726 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:41:57.172719765 +0000 UTC m=+1082.097209387 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "metrics-server-cert" not found Jan 21 15:41:56 crc kubenswrapper[4773]: I0121 15:41:56.320571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:41:56 crc kubenswrapper[4773]: E0121 15:41:56.320754 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:56 crc kubenswrapper[4773]: E0121 15:41:56.320943 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. 
No retries permitted until 2026-01-21 15:42:04.320928542 +0000 UTC m=+1089.245418164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:41:56 crc kubenswrapper[4773]: I0121 15:41:56.623468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:41:56 crc kubenswrapper[4773]: E0121 15:41:56.623623 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:56 crc kubenswrapper[4773]: E0121 15:41:56.623729 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert podName:d8cb173b-7eaa-4183-8028-0a1c4730097c nodeName:}" failed. No retries permitted until 2026-01-21 15:42:04.623686404 +0000 UTC m=+1089.548176026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" (UID: "d8cb173b-7eaa-4183-8028-0a1c4730097c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 15:41:57 crc kubenswrapper[4773]: I0121 15:41:57.232131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:57 crc kubenswrapper[4773]: I0121 15:41:57.232241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:41:57 crc kubenswrapper[4773]: E0121 15:41:57.232310 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 15:41:57 crc kubenswrapper[4773]: E0121 15:41:57.232374 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 15:41:57 crc kubenswrapper[4773]: E0121 15:41:57.232399 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:42:05.232381947 +0000 UTC m=+1090.156871569 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "metrics-server-cert" not found Jan 21 15:41:57 crc kubenswrapper[4773]: E0121 15:41:57.232447 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs podName:83918de1-f089-46b5-99e4-b249fbe09d65 nodeName:}" failed. No retries permitted until 2026-01-21 15:42:05.232427229 +0000 UTC m=+1090.156916861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs") pod "openstack-operator-controller-manager-7c6777f5bd-z474b" (UID: "83918de1-f089-46b5-99e4-b249fbe09d65") : secret "webhook-server-cert" not found Jan 21 15:42:03 crc kubenswrapper[4773]: E0121 15:42:03.424656 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8" Jan 21 15:42:03 crc kubenswrapper[4773]: E0121 15:42:03.425295 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4j6rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-9f958b845-s7bsl_openstack-operators(516d7adc-2317-406b-92ef-6ed5a74a74b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:03 crc kubenswrapper[4773]: E0121 15:42:03.426742 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" podUID="516d7adc-2317-406b-92ef-6ed5a74a74b3" Jan 21 15:42:03 crc kubenswrapper[4773]: E0121 15:42:03.835231 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8\\\"\"" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" podUID="516d7adc-2317-406b-92ef-6ed5a74a74b3" Jan 21 15:42:04 crc kubenswrapper[4773]: I0121 15:42:04.339216 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:04 crc kubenswrapper[4773]: E0121 15:42:04.339380 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 15:42:04 crc kubenswrapper[4773]: E0121 15:42:04.339449 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert podName:fdfe2fce-12c1-4026-b40f-77234a609986 nodeName:}" failed. No retries permitted until 2026-01-21 15:42:20.339429421 +0000 UTC m=+1105.263919033 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert") pod "infra-operator-controller-manager-77c48c7859-pldbp" (UID: "fdfe2fce-12c1-4026-b40f-77234a609986") : secret "infra-operator-webhook-server-cert" not found Jan 21 15:42:04 crc kubenswrapper[4773]: I0121 15:42:04.641875 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:42:04 crc kubenswrapper[4773]: I0121 15:42:04.653741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8cb173b-7eaa-4183-8028-0a1c4730097c-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz\" (UID: \"d8cb173b-7eaa-4183-8028-0a1c4730097c\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:42:04 crc kubenswrapper[4773]: I0121 15:42:04.726352 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gdm7s" Jan 21 15:42:04 crc kubenswrapper[4773]: I0121 15:42:04.734905 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.247447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.247603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.252798 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-webhook-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.254196 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83918de1-f089-46b5-99e4-b249fbe09d65-metrics-certs\") pod \"openstack-operator-controller-manager-7c6777f5bd-z474b\" (UID: \"83918de1-f089-46b5-99e4-b249fbe09d65\") " pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.286216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nr48l" Jan 21 15:42:05 crc kubenswrapper[4773]: I0121 15:42:05.295342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:13 crc kubenswrapper[4773]: E0121 15:42:13.164567 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492" Jan 21 15:42:13 crc kubenswrapper[4773]: E0121 15:42:13.165294 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ngzn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-594c8c9d5d-hpqt5_openstack-operators(8a54ffaf-3268-4696-952e-ee6381310628): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:13 crc kubenswrapper[4773]: E0121 15:42:13.166529 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" podUID="8a54ffaf-3268-4696-952e-ee6381310628" Jan 21 15:42:13 crc kubenswrapper[4773]: E0121 15:42:13.904171 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" podUID="8a54ffaf-3268-4696-952e-ee6381310628" Jan 21 15:42:14 crc kubenswrapper[4773]: E0121 15:42:14.056551 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028" Jan 21 15:42:14 crc kubenswrapper[4773]: E0121 15:42:14.056772 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxx8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-c6994669c-7kgnl_openstack-operators(ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:14 crc kubenswrapper[4773]: E0121 15:42:14.057989 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" podUID="ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c" Jan 21 15:42:14 crc kubenswrapper[4773]: E0121 15:42:14.908967 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028\\\"\"" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" podUID="ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c" Jan 21 15:42:17 crc kubenswrapper[4773]: E0121 15:42:17.080929 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525" Jan 21 15:42:17 crc kubenswrapper[4773]: E0121 15:42:17.081083 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x9qv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-c79x5_openstack-operators(32f1de73-4ee0-4eda-8709-d1642d8452f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:17 crc kubenswrapper[4773]: E0121 15:42:17.082212 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" podUID="32f1de73-4ee0-4eda-8709-d1642d8452f2" Jan 21 15:42:17 crc kubenswrapper[4773]: E0121 15:42:17.925192 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" podUID="32f1de73-4ee0-4eda-8709-d1642d8452f2" Jan 21 15:42:19 crc kubenswrapper[4773]: E0121 15:42:19.818063 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488" Jan 21 15:42:19 crc kubenswrapper[4773]: E0121 15:42:19.818234 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbswf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-9b68f5989-lvnp8_openstack-operators(8678664b-38d8-4482-ae3d-fa1a74a709fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:19 crc kubenswrapper[4773]: E0121 15:42:19.819455 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" podUID="8678664b-38d8-4482-ae3d-fa1a74a709fd" Jan 21 15:42:19 crc kubenswrapper[4773]: E0121 15:42:19.942151 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" podUID="8678664b-38d8-4482-ae3d-fa1a74a709fd" Jan 21 15:42:20 crc kubenswrapper[4773]: I0121 15:42:20.382410 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:20 crc kubenswrapper[4773]: I0121 15:42:20.389877 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdfe2fce-12c1-4026-b40f-77234a609986-cert\") pod \"infra-operator-controller-manager-77c48c7859-pldbp\" (UID: \"fdfe2fce-12c1-4026-b40f-77234a609986\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:20 crc kubenswrapper[4773]: I0121 15:42:20.603669 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zp7dj" Jan 21 15:42:20 crc kubenswrapper[4773]: I0121 15:42:20.611921 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:21 crc kubenswrapper[4773]: E0121 15:42:21.580845 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a" Jan 21 15:42:21 crc kubenswrapper[4773]: E0121 15:42:21.581729 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rwcxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7ddb5c749-h8g4t_openstack-operators(330099a5-d43b-482e-a4cb-e6c3bb2c6706): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:21 crc kubenswrapper[4773]: E0121 15:42:21.582979 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" podUID="330099a5-d43b-482e-a4cb-e6c3bb2c6706" Jan 21 15:42:21 crc kubenswrapper[4773]: E0121 15:42:21.955870 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" podUID="330099a5-d43b-482e-a4cb-e6c3bb2c6706" Jan 21 15:42:22 crc kubenswrapper[4773]: E0121 15:42:22.553656 4773 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 21 15:42:22 crc kubenswrapper[4773]: E0121 15:42:22.553875 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n58f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-nwvmf_openstack-operators(b7286f1c-434c-4ebb-9d2a-54a6596a63b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:22 crc kubenswrapper[4773]: E0121 15:42:22.555073 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" podUID="b7286f1c-434c-4ebb-9d2a-54a6596a63b5" Jan 21 15:42:22 crc kubenswrapper[4773]: E0121 15:42:22.960265 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" podUID="b7286f1c-434c-4ebb-9d2a-54a6596a63b5" Jan 21 15:42:23 crc kubenswrapper[4773]: E0121 15:42:23.293375 4773 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822" Jan 21 15:42:23 crc kubenswrapper[4773]: E0121 15:42:23.293923 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrpnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-77d5c5b54f-wrww2_openstack-operators(50f1e60f-1194-428b-b7e2-ccf0ebb384c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:23 crc kubenswrapper[4773]: E0121 15:42:23.295218 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" podUID="50f1e60f-1194-428b-b7e2-ccf0ebb384c7" Jan 21 15:42:23 crc kubenswrapper[4773]: E0121 15:42:23.966778 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" podUID="50f1e60f-1194-428b-b7e2-ccf0ebb384c7" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.027825 4773 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.028005 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mp27z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-cn7gp_openstack-operators(0f906590-d519-4724-bc67-05c6b3a9191d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.029960 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" podUID="0f906590-d519-4724-bc67-05c6b3a9191d" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.629677 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.630040 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cbx92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-94knv_openstack-operators(eeb5c272-4544-47a4-8d08-187872fea7bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.631320 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" podUID="eeb5c272-4544-47a4-8d08-187872fea7bd" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.974525 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" podUID="eeb5c272-4544-47a4-8d08-187872fea7bd" Jan 21 15:42:24 crc kubenswrapper[4773]: E0121 15:42:24.974543 4773 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" podUID="0f906590-d519-4724-bc67-05c6b3a9191d" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.030913 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.031124 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrw4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-fcbjv_openstack-operators(d60c449c-a583-4b8e-8265-9df068220041): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.032293 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" podUID="d60c449c-a583-4b8e-8265-9df068220041" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.467967 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.468137 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2lfcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-jz6m4_openstack-operators(43aaba2b-296a-407d-9ea2-bbf4c05e868e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.469337 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" podUID="43aaba2b-296a-407d-9ea2-bbf4c05e868e" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.902483 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.902958 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkhk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-864f6b75bf-xdcfb_openstack-operators(2264dd36-5855-49cc-bf31-1d1e9dcb1f9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.904985 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" podUID="2264dd36-5855-49cc-bf31-1d1e9dcb1f9f" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.979147 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32\\\"\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" podUID="2264dd36-5855-49cc-bf31-1d1e9dcb1f9f" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.979332 4773 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" podUID="43aaba2b-296a-407d-9ea2-bbf4c05e868e" Jan 21 15:42:25 crc kubenswrapper[4773]: E0121 15:42:25.979739 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" podUID="d60c449c-a583-4b8e-8265-9df068220041" Jan 21 15:42:31 crc kubenswrapper[4773]: I0121 15:42:31.793680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b"] Jan 21 15:42:31 crc kubenswrapper[4773]: I0121 15:42:31.869556 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp"] Jan 21 15:42:31 crc kubenswrapper[4773]: I0121 15:42:31.891415 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz"] Jan 21 15:42:31 crc kubenswrapper[4773]: W0121 15:42:31.900148 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8cb173b_7eaa_4183_8028_0a1c4730097c.slice/crio-f2aba1d9e98ebc9eb8d8443875d209e587fcd07acb4c9cf3c2fee5c9022c8bff WatchSource:0}: Error finding container f2aba1d9e98ebc9eb8d8443875d209e587fcd07acb4c9cf3c2fee5c9022c8bff: Status 404 returned error can't find the container with id f2aba1d9e98ebc9eb8d8443875d209e587fcd07acb4c9cf3c2fee5c9022c8bff Jan 21 15:42:32 crc 
kubenswrapper[4773]: I0121 15:42:32.046155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" event={"ID":"ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c","Type":"ContainerStarted","Data":"d9451c00ab3b9371159b3a1e0498201ea40d23974d03c94d4bb603aa6264c9b5"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.047652 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.051203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" event={"ID":"ddac57c3-b102-4cfc-8b1e-53de342cef39","Type":"ContainerStarted","Data":"e1b3cc99fef6273461589e900133af9bcfa65f3809f9d55a0102370fa453455d"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.052284 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.065892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" event={"ID":"4a9d0079-9636-4913-95fd-305e8d54280d","Type":"ContainerStarted","Data":"b30e5d29c6968c1453f8a5cdb706b278f52af491800f0b33694b809afa134d6b"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.066323 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.071514 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" 
event={"ID":"fdfe2fce-12c1-4026-b40f-77234a609986","Type":"ContainerStarted","Data":"990f9d91e582b13524b231eca0e95c991ec50b8cc1ebb32324dce90ec8ce0e57"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.079103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" event={"ID":"83918de1-f089-46b5-99e4-b249fbe09d65","Type":"ContainerStarted","Data":"78b749ef90450d79bd39417bab67f6ce73d6f81f58d61f162d63c9dab7988389"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.079292 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.097976 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" podStartSLOduration=2.820137313 podStartE2EDuration="44.097957405s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.138989939 +0000 UTC m=+1075.063479561" lastFinishedPulling="2026-01-21 15:42:31.416810031 +0000 UTC m=+1116.341299653" observedRunningTime="2026-01-21 15:42:32.06996834 +0000 UTC m=+1116.994457962" watchObservedRunningTime="2026-01-21 15:42:32.097957405 +0000 UTC m=+1117.022447027" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.099216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" event={"ID":"02b38141-e855-4eeb-ac52-d135fb5f44f7","Type":"ContainerStarted","Data":"6c060a17423289fa90ad844d40a15449ee3fd37f91f06c0b39702321ccec9745"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.114846 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" 
event={"ID":"6411a5d3-7b7b-4735-b01c-7c4aa0d5509c","Type":"ContainerStarted","Data":"6b15ed28eebd444bffddf280c3547d1d8b2b32b303a128f223e326d6c382b558"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.126243 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" podStartSLOduration=3.207436565 podStartE2EDuration="44.126221288s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.574669977 +0000 UTC m=+1075.499159599" lastFinishedPulling="2026-01-21 15:42:31.4934547 +0000 UTC m=+1116.417944322" observedRunningTime="2026-01-21 15:42:32.095100608 +0000 UTC m=+1117.019590220" watchObservedRunningTime="2026-01-21 15:42:32.126221288 +0000 UTC m=+1117.050710910" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.136097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" event={"ID":"9a48a802-404e-4a60-821b-8b91a4830da8","Type":"ContainerStarted","Data":"d3872c75be7e463170bf0306675971a2d437e2d9e6b1dcb90fac9c35ffd92e63"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.136932 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.148584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" event={"ID":"7ab140ad-f64b-45e3-a393-f66567e98a9f","Type":"ContainerStarted","Data":"65c6ab159470fa9b09831e4a9745016e549d661037f99b39c8060ec2fdabe1ea"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.149321 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.154250 4773 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" podStartSLOduration=3.263445499 podStartE2EDuration="44.154229704s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.395776581 +0000 UTC m=+1075.320266203" lastFinishedPulling="2026-01-21 15:42:31.286560786 +0000 UTC m=+1116.211050408" observedRunningTime="2026-01-21 15:42:32.137306927 +0000 UTC m=+1117.061796549" watchObservedRunningTime="2026-01-21 15:42:32.154229704 +0000 UTC m=+1117.078719326" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.158094 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" event={"ID":"516d7adc-2317-406b-92ef-6ed5a74a74b3","Type":"ContainerStarted","Data":"dac3b004221b9db44ccbd021ad0078e2396087fd7ffa609f3218ed6454199eab"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.158791 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.167260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" event={"ID":"d8cb173b-7eaa-4183-8028-0a1c4730097c","Type":"ContainerStarted","Data":"f2aba1d9e98ebc9eb8d8443875d209e587fcd07acb4c9cf3c2fee5c9022c8bff"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.174437 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" podStartSLOduration=3.365365298 podStartE2EDuration="44.174421259s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.574953564 +0000 UTC m=+1075.499443196" lastFinishedPulling="2026-01-21 15:42:31.384009535 +0000 UTC 
m=+1116.308499157" observedRunningTime="2026-01-21 15:42:32.167076041 +0000 UTC m=+1117.091565663" watchObservedRunningTime="2026-01-21 15:42:32.174421259 +0000 UTC m=+1117.098910871" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.175895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" event={"ID":"d7b52472-5c30-471f-a937-c50d96103339","Type":"ContainerStarted","Data":"e98c1d5bea24cb24fcf04b527f5f1e346f5bd776121e065701636634a98b1c13"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.176686 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.184992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" event={"ID":"32f1de73-4ee0-4eda-8709-d1642d8452f2","Type":"ContainerStarted","Data":"b8a940e20c1b234f977ea1ee3559836d89597f4e87057cdae07793b34ffa8187"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.185618 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.188971 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" event={"ID":"8a54ffaf-3268-4696-952e-ee6381310628","Type":"ContainerStarted","Data":"5bd383b76bcf85d2805cb2e3e907c989b24ec8ea30bc8cb9b61a38b41491009b"} Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.189387 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.198431 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" podStartSLOduration=3.388829731 podStartE2EDuration="44.198416107s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.574836061 +0000 UTC m=+1075.499325683" lastFinishedPulling="2026-01-21 15:42:31.384422437 +0000 UTC m=+1116.308912059" observedRunningTime="2026-01-21 15:42:32.198101658 +0000 UTC m=+1117.122591280" watchObservedRunningTime="2026-01-21 15:42:32.198416107 +0000 UTC m=+1117.122905729" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.224683 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dp2s7" podStartSLOduration=2.382535792 podStartE2EDuration="43.224664075s" podCreationTimestamp="2026-01-21 15:41:49 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.574548914 +0000 UTC m=+1075.499038536" lastFinishedPulling="2026-01-21 15:42:31.416677187 +0000 UTC m=+1116.341166819" observedRunningTime="2026-01-21 15:42:32.224281685 +0000 UTC m=+1117.148771307" watchObservedRunningTime="2026-01-21 15:42:32.224664075 +0000 UTC m=+1117.149153697" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.269674 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" podStartSLOduration=43.26965675 podStartE2EDuration="43.26965675s" podCreationTimestamp="2026-01-21 15:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:42:32.263817602 +0000 UTC m=+1117.188307234" watchObservedRunningTime="2026-01-21 15:42:32.26965675 +0000 UTC m=+1117.194146362" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.302723 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" podStartSLOduration=3.07540409 podStartE2EDuration="44.302705412s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:49.96945863 +0000 UTC m=+1074.893948252" lastFinishedPulling="2026-01-21 15:42:31.196759952 +0000 UTC m=+1116.121249574" observedRunningTime="2026-01-21 15:42:32.297418669 +0000 UTC m=+1117.221908291" watchObservedRunningTime="2026-01-21 15:42:32.302705412 +0000 UTC m=+1117.227195034" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.321191 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" podStartSLOduration=3.507381241 podStartE2EDuration="44.32117516s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.575277743 +0000 UTC m=+1075.499767365" lastFinishedPulling="2026-01-21 15:42:31.389071662 +0000 UTC m=+1116.313561284" observedRunningTime="2026-01-21 15:42:32.318626961 +0000 UTC m=+1117.243116603" watchObservedRunningTime="2026-01-21 15:42:32.32117516 +0000 UTC m=+1117.245664782" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.336513 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" podStartSLOduration=3.074034716 podStartE2EDuration="44.336493804s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.16054129 +0000 UTC m=+1075.085030912" lastFinishedPulling="2026-01-21 15:42:31.423000378 +0000 UTC m=+1116.347490000" observedRunningTime="2026-01-21 15:42:32.333899734 +0000 UTC m=+1117.258389376" watchObservedRunningTime="2026-01-21 15:42:32.336493804 +0000 UTC m=+1117.260983426" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.367657 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" podStartSLOduration=3.076398467 podStartE2EDuration="44.367641504s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.133030575 +0000 UTC m=+1075.057520197" lastFinishedPulling="2026-01-21 15:42:31.424273612 +0000 UTC m=+1116.348763234" observedRunningTime="2026-01-21 15:42:32.358908489 +0000 UTC m=+1117.283398111" watchObservedRunningTime="2026-01-21 15:42:32.367641504 +0000 UTC m=+1117.292131116" Jan 21 15:42:32 crc kubenswrapper[4773]: I0121 15:42:32.396075 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" podStartSLOduration=3.580164385 podStartE2EDuration="44.396059491s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.574335888 +0000 UTC m=+1075.498825510" lastFinishedPulling="2026-01-21 15:42:31.390230994 +0000 UTC m=+1116.314720616" observedRunningTime="2026-01-21 15:42:32.388580759 +0000 UTC m=+1117.313070391" watchObservedRunningTime="2026-01-21 15:42:32.396059491 +0000 UTC m=+1117.320549113" Jan 21 15:42:33 crc kubenswrapper[4773]: I0121 15:42:33.199562 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" event={"ID":"83918de1-f089-46b5-99e4-b249fbe09d65","Type":"ContainerStarted","Data":"2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593"} Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.216019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" event={"ID":"fdfe2fce-12c1-4026-b40f-77234a609986","Type":"ContainerStarted","Data":"f3b61d3ceec805bd521f622465b93b1c5fd69db506f619859d266a6e2324bfb3"} Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.217048 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.217451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" event={"ID":"d8cb173b-7eaa-4183-8028-0a1c4730097c","Type":"ContainerStarted","Data":"0b19b670bef9c4a202ea9fc1002429def9982670adf36293952817b0d3aebba5"} Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.217648 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.219119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" event={"ID":"8678664b-38d8-4482-ae3d-fa1a74a709fd","Type":"ContainerStarted","Data":"488b5a1c5afa5fcbd95eef664b34a8c17504cb66bb34011abc4dc5e072bfa25f"} Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.219321 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.266462 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podStartSLOduration=44.281972721 podStartE2EDuration="47.266442264s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:42:31.902953122 +0000 UTC m=+1116.827442744" lastFinishedPulling="2026-01-21 15:42:34.887422665 +0000 UTC m=+1119.811912287" observedRunningTime="2026-01-21 15:42:35.266031503 +0000 UTC m=+1120.190521145" watchObservedRunningTime="2026-01-21 15:42:35.266442264 +0000 UTC m=+1120.190931886" Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.267202 4773 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" podStartSLOduration=44.270875743 podStartE2EDuration="47.267196425s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:42:31.891116733 +0000 UTC m=+1116.815606355" lastFinishedPulling="2026-01-21 15:42:34.887437415 +0000 UTC m=+1119.811927037" observedRunningTime="2026-01-21 15:42:35.240090972 +0000 UTC m=+1120.164580604" watchObservedRunningTime="2026-01-21 15:42:35.267196425 +0000 UTC m=+1120.191686047" Jan 21 15:42:35 crc kubenswrapper[4773]: I0121 15:42:35.409128 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8" podStartSLOduration=2.509962807 podStartE2EDuration="47.409111985s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:49.991023331 +0000 UTC m=+1074.915512953" lastFinishedPulling="2026-01-21 15:42:34.890172519 +0000 UTC m=+1119.814662131" observedRunningTime="2026-01-21 15:42:35.295517699 +0000 UTC m=+1120.220007321" watchObservedRunningTime="2026-01-21 15:42:35.409111985 +0000 UTC m=+1120.333601607" Jan 21 15:42:37 crc kubenswrapper[4773]: I0121 15:42:37.233148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" event={"ID":"b7286f1c-434c-4ebb-9d2a-54a6596a63b5","Type":"ContainerStarted","Data":"71671b5c7ba9100ae663ce488a20fa59f164e5ea6a065f1acd8146e3b6e4be90"} Jan 21 15:42:38 crc kubenswrapper[4773]: I0121 15:42:38.672205 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-s7bsl" Jan 21 15:42:38 crc kubenswrapper[4773]: I0121 15:42:38.722227 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-c6994669c-7kgnl" Jan 21 15:42:38 crc kubenswrapper[4773]: I0121 15:42:38.773904 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-hpqt5" Jan 21 15:42:38 crc kubenswrapper[4773]: I0121 15:42:38.891340 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.045595 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d9k6n" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.240470 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.245231 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.285907 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf" podStartSLOduration=5.70140188 podStartE2EDuration="51.285871281s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.366966641 +0000 UTC m=+1075.291456263" lastFinishedPulling="2026-01-21 15:42:35.951436042 +0000 UTC m=+1120.875925664" observedRunningTime="2026-01-21 15:42:39.274791711 +0000 UTC m=+1124.199281373" watchObservedRunningTime="2026-01-21 15:42:39.285871281 +0000 UTC m=+1124.210360913" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.339134 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-64cd966744-2wbz9" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.458447 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.461321 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.486025 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" Jan 21 15:42:39 crc kubenswrapper[4773]: I0121 15:42:39.545180 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.252666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" event={"ID":"50f1e60f-1194-428b-b7e2-ccf0ebb384c7","Type":"ContainerStarted","Data":"ff74ad6b3a899af05a0ac5e161d6f9b47fce992be044c5d7d9c994d5271d0f85"} Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.253135 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.255413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" event={"ID":"2264dd36-5855-49cc-bf31-1d1e9dcb1f9f","Type":"ContainerStarted","Data":"8412918a2ecb74a7aeb7910a1f1d475cc14bd5714b7f79058bc8de796eb80f30"} Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.255605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.258304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" event={"ID":"d60c449c-a583-4b8e-8265-9df068220041","Type":"ContainerStarted","Data":"2b271bac413686ba6a7ce765e512001b50635fe27f5ed68490eeacdd96b569c7"} Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.258454 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.260104 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" event={"ID":"330099a5-d43b-482e-a4cb-e6c3bb2c6706","Type":"ContainerStarted","Data":"7f33d3053a456c105e7e50c7311b097b3448a69ecfbd25d133174b98fdba021f"} Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.260248 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.261282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" event={"ID":"eeb5c272-4544-47a4-8d08-187872fea7bd","Type":"ContainerStarted","Data":"f70f54f0a982017893ff69fda35238edecc629ac57c7578a1e297a66bd5073cb"} Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.261591 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.270880 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2" 
podStartSLOduration=2.805401761 podStartE2EDuration="52.270864925s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.359963899 +0000 UTC m=+1075.284453521" lastFinishedPulling="2026-01-21 15:42:39.825427043 +0000 UTC m=+1124.749916685" observedRunningTime="2026-01-21 15:42:40.266667942 +0000 UTC m=+1125.191157564" watchObservedRunningTime="2026-01-21 15:42:40.270864925 +0000 UTC m=+1125.195354547" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.296636 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv" podStartSLOduration=2.6785571409999998 podStartE2EDuration="52.296621971s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.113482169 +0000 UTC m=+1075.037971791" lastFinishedPulling="2026-01-21 15:42:39.731546989 +0000 UTC m=+1124.656036621" observedRunningTime="2026-01-21 15:42:40.280890547 +0000 UTC m=+1125.205380169" watchObservedRunningTime="2026-01-21 15:42:40.296621971 +0000 UTC m=+1125.221111593" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.298971 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" podStartSLOduration=3.137142187 podStartE2EDuration="52.298964675s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.570372469 +0000 UTC m=+1075.494862091" lastFinishedPulling="2026-01-21 15:42:39.732194957 +0000 UTC m=+1124.656684579" observedRunningTime="2026-01-21 15:42:40.292967852 +0000 UTC m=+1125.217457474" watchObservedRunningTime="2026-01-21 15:42:40.298964675 +0000 UTC m=+1125.223454297" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.309830 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb" 
podStartSLOduration=2.803564786 podStartE2EDuration="52.309815897s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.39468383 +0000 UTC m=+1075.319173452" lastFinishedPulling="2026-01-21 15:42:39.900934941 +0000 UTC m=+1124.825424563" observedRunningTime="2026-01-21 15:42:40.30656483 +0000 UTC m=+1125.231054452" watchObservedRunningTime="2026-01-21 15:42:40.309815897 +0000 UTC m=+1125.234305519" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.345756 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t" podStartSLOduration=3.205839877 podStartE2EDuration="52.345736117s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.14632412 +0000 UTC m=+1075.070813742" lastFinishedPulling="2026-01-21 15:42:39.28622037 +0000 UTC m=+1124.210709982" observedRunningTime="2026-01-21 15:42:40.34326763 +0000 UTC m=+1125.267757262" watchObservedRunningTime="2026-01-21 15:42:40.345736117 +0000 UTC m=+1125.270225739" Jan 21 15:42:40 crc kubenswrapper[4773]: I0121 15:42:40.621710 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-pldbp" Jan 21 15:42:43 crc kubenswrapper[4773]: I0121 15:42:43.283515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" event={"ID":"43aaba2b-296a-407d-9ea2-bbf4c05e868e","Type":"ContainerStarted","Data":"e91c66b948d445a876e4996e2021683a363d39aec462ea5130b92115fd358f66"} Jan 21 15:42:43 crc kubenswrapper[4773]: I0121 15:42:43.284348 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" Jan 21 15:42:43 crc kubenswrapper[4773]: I0121 15:42:43.308488 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4" podStartSLOduration=3.6714565710000002 podStartE2EDuration="55.308465912s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.326660595 +0000 UTC m=+1075.251150217" lastFinishedPulling="2026-01-21 15:42:41.963669936 +0000 UTC m=+1126.888159558" observedRunningTime="2026-01-21 15:42:43.304752131 +0000 UTC m=+1128.229241763" watchObservedRunningTime="2026-01-21 15:42:43.308465912 +0000 UTC m=+1128.232955544"
Jan 21 15:42:44 crc kubenswrapper[4773]: I0121 15:42:44.743337 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz"
Jan 21 15:42:45 crc kubenswrapper[4773]: I0121 15:42:45.302512 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b"
Jan 21 15:42:46 crc kubenswrapper[4773]: I0121 15:42:46.309124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" event={"ID":"0f906590-d519-4724-bc67-05c6b3a9191d","Type":"ContainerStarted","Data":"001597db876e3957533ca86637dfc63ed915ea51e80457d8039af5b31b56605f"}
Jan 21 15:42:46 crc kubenswrapper[4773]: I0121 15:42:46.309657 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp"
Jan 21 15:42:46 crc kubenswrapper[4773]: I0121 15:42:46.332022 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp" podStartSLOduration=2.930047946 podStartE2EDuration="58.332003848s" podCreationTimestamp="2026-01-21 15:41:48 +0000 UTC" firstStartedPulling="2026-01-21 15:41:50.336213518 +0000 UTC m=+1075.260703140" lastFinishedPulling="2026-01-21 15:42:45.73816942 +0000 UTC m=+1130.662659042" observedRunningTime="2026-01-21 15:42:46.326233302 +0000 UTC m=+1131.250722924" watchObservedRunningTime="2026-01-21 15:42:46.332003848 +0000 UTC m=+1131.256493470"
Jan 21 15:42:48 crc kubenswrapper[4773]: I0121 15:42:48.635865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-h8g4t"
Jan 21 15:42:48 crc kubenswrapper[4773]: I0121 15:42:48.650041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-lvnp8"
Jan 21 15:42:48 crc kubenswrapper[4773]: I0121 15:42:48.788880 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-wrww2"
Jan 21 15:42:48 crc kubenswrapper[4773]: I0121 15:42:48.903397 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-fcbjv"
Jan 21 15:42:48 crc kubenswrapper[4773]: I0121 15:42:48.989482 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-xdcfb"
Jan 21 15:42:49 crc kubenswrapper[4773]: I0121 15:42:49.018826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-jz6m4"
Jan 21 15:42:49 crc kubenswrapper[4773]: I0121 15:42:49.186378 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-nwvmf"
Jan 21 15:42:49 crc kubenswrapper[4773]: I0121 15:42:49.557638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv"
Jan 21 15:42:55 crc kubenswrapper[4773]: I0121 15:42:55.206089 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:42:55 crc kubenswrapper[4773]: I0121 15:42:55.206639 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:42:59 crc kubenswrapper[4773]: I0121 15:42:59.089051 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-cn7gp"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.676462 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"]
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.679447 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.701047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xbqnl"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.701263 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"]
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.701341 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.701509 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.701711 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.764485 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"]
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.767337 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.770598 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.780490 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"]
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.832670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckctl\" (UniqueName: \"kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.832801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.934832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbmm\" (UniqueName: \"kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.934892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.934920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.935081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckctl\" (UniqueName: \"kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.935133 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.936170 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:16 crc kubenswrapper[4773]: I0121 15:43:16.954913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckctl\" (UniqueName: \"kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl\") pod \"dnsmasq-dns-675f4bcbfc-tjchm\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.011454 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.037481 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbmm\" (UniqueName: \"kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.037528 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.037558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.038571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.039483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.064311 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbmm\" (UniqueName: \"kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm\") pod \"dnsmasq-dns-78dd6ddcc-grw5k\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.086116 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k"
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.430955 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"]
Jan 21 15:43:17 crc kubenswrapper[4773]: W0121 15:43:17.438925 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0aa7494_cbc8_4bca_bf38_33ef045d689b.slice/crio-11d02b3d9b9a2c7b95e9b3e6c16e54a4abacd6b2197bb1fc5c407854d5e9e42c WatchSource:0}: Error finding container 11d02b3d9b9a2c7b95e9b3e6c16e54a4abacd6b2197bb1fc5c407854d5e9e42c: Status 404 returned error can't find the container with id 11d02b3d9b9a2c7b95e9b3e6c16e54a4abacd6b2197bb1fc5c407854d5e9e42c
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.494087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"]
Jan 21 15:43:17 crc kubenswrapper[4773]: W0121 15:43:17.499513 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb422d823_b626_442e_8b51_2439c4046d71.slice/crio-af7e49fbad01207761c158ff9313e87ec653b3d20f8652b7f520aa847feb8e90 WatchSource:0}: Error finding container af7e49fbad01207761c158ff9313e87ec653b3d20f8652b7f520aa847feb8e90: Status 404 returned error can't find the container with id af7e49fbad01207761c158ff9313e87ec653b3d20f8652b7f520aa847feb8e90
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.518950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm" event={"ID":"b422d823-b626-442e-8b51-2439c4046d71","Type":"ContainerStarted","Data":"af7e49fbad01207761c158ff9313e87ec653b3d20f8652b7f520aa847feb8e90"}
Jan 21 15:43:17 crc kubenswrapper[4773]: I0121 15:43:17.519997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k" event={"ID":"f0aa7494-cbc8-4bca-bf38-33ef045d689b","Type":"ContainerStarted","Data":"11d02b3d9b9a2c7b95e9b3e6c16e54a4abacd6b2197bb1fc5c407854d5e9e42c"}
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.440756 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.485305 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.486545 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.502781 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.589833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.590162 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.590308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrpmc\" (UniqueName: \"kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.696198 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.696263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrpmc\" (UniqueName: \"kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.696353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.699220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.720001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.733003 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrpmc\" (UniqueName: \"kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc\") pod \"dnsmasq-dns-666b6646f7-mldv7\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.756413 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.794164 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.796146 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.803027 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"]
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.810107 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.899599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjhhd\" (UniqueName: \"kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.899704 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:19 crc kubenswrapper[4773]: I0121 15:43:19.899736 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.000206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjhhd\" (UniqueName: \"kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.000289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.000313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.001546 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.002039 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.018144 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjhhd\" (UniqueName: \"kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd\") pod \"dnsmasq-dns-57d769cc4f-4hbk9\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.126137 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.357062 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"]
Jan 21 15:43:20 crc kubenswrapper[4773]: W0121 15:43:20.377722 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7edb06e_2b04_4d03_b089_4724a466d720.slice/crio-8baee2408b6ee9636d6079a2a62c5605cd814aa8dc77d8e0a030ca1dc80c18a0 WatchSource:0}: Error finding container 8baee2408b6ee9636d6079a2a62c5605cd814aa8dc77d8e0a030ca1dc80c18a0: Status 404 returned error can't find the container with id 8baee2408b6ee9636d6079a2a62c5605cd814aa8dc77d8e0a030ca1dc80c18a0
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.549155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerStarted","Data":"8baee2408b6ee9636d6079a2a62c5605cd814aa8dc77d8e0a030ca1dc80c18a0"}
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.608575 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.609894 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.618140 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lxb5s"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.618800 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.619298 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.620576 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.620624 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.620715 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.619126 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.634980 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.646875 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"]
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.712874 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.712924 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.712950 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.712987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.713092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.713154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.713241 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkvd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.713314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.713362 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.715102 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.715123 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.816852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.816913 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.816937 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.817055 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.817079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.817102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.817132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818044 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkvd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818104 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818626 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.818730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.819748 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.822402 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.822450 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84c1d087d48810d54c9111c49096770b7a04a8eec4ea7f577cb5bfe0d8b6f672/globalmount\"" pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.824352 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.825010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.825400 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0"
Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.836448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID:
\"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.853464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkvd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.865541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " pod="openstack/rabbitmq-server-0" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.935363 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.937217 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.941962 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.942353 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.943296 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.943829 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.944550 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.945321 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f7cnv" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.945792 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.946200 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 15:43:20 crc kubenswrapper[4773]: I0121 15:43:20.957408 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026491 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgzbf\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026740 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.026788 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgzbf\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128555 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" 
Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.128711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.129588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.130056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.130354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.130868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.131062 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.141580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.146932 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.150080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.157486 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.173742 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgzbf\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.221307 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.221355 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/979200276d874b4c9cbd62a4a1053c8813c6c43fc0020f0aeefb9b7d36abaec8/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.298937 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.451103 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 15:43:21 crc kubenswrapper[4773]: W0121 15:43:21.458200 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b1166d_2f9b_452c_a0b2_e7f21998ff45.slice/crio-4e7bf9a720ae0056995d1bf8e160e926229c553ed6c30a9a3f731154ba8dbbf1 WatchSource:0}: Error finding container 4e7bf9a720ae0056995d1bf8e160e926229c553ed6c30a9a3f731154ba8dbbf1: Status 404 returned error can't find the container with id 4e7bf9a720ae0056995d1bf8e160e926229c553ed6c30a9a3f731154ba8dbbf1 Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.569502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerStarted","Data":"1ba7156543f7e270f97806dd19cf4e66916aaad3364c5ff0461185b2264aae3e"} Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.570714 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerStarted","Data":"4e7bf9a720ae0056995d1bf8e160e926229c553ed6c30a9a3f731154ba8dbbf1"} Jan 21 15:43:21 crc kubenswrapper[4773]: I0121 15:43:21.597151 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.080571 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.273760 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.275142 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.284998 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4ph9w" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.285095 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.285247 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.285330 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.299405 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.315425 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.354919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.354984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-kolla-config\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355022 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355186 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvrx\" (UniqueName: \"kubernetes.io/projected/32888aa3-cb52-484f-9745-5d5dfc5179df-kube-api-access-4qvrx\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.355243 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-default\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456828 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456929 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvrx\" (UniqueName: \"kubernetes.io/projected/32888aa3-cb52-484f-9745-5d5dfc5179df-kube-api-access-4qvrx\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-default\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.456977 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.457001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-kolla-config\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.457022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.462473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.463362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.463644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.465398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-config-data-default\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.469114 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32888aa3-cb52-484f-9745-5d5dfc5179df-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.473437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32888aa3-cb52-484f-9745-5d5dfc5179df-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.475117 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.475141 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14d2fd25a749e7e154dca51091dc90e8b086a31b9f64c42a48152253c4a7749a/globalmount\"" pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.480382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvrx\" (UniqueName: \"kubernetes.io/projected/32888aa3-cb52-484f-9745-5d5dfc5179df-kube-api-access-4qvrx\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.539772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bdb8f33-a7b6-4697-8db0-77b5c65fd5ec\") pod \"openstack-galera-0\" (UID: \"32888aa3-cb52-484f-9745-5d5dfc5179df\") " pod="openstack/openstack-galera-0" Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.600170 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerStarted","Data":"e3acd80e2d389ae53d3d78bdec15097675f1406fef4a69507d88c3989e00585e"} Jan 21 15:43:22 crc kubenswrapper[4773]: I0121 15:43:22.610821 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.120432 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 15:43:23 crc kubenswrapper[4773]: W0121 15:43:23.131888 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32888aa3_cb52_484f_9745_5d5dfc5179df.slice/crio-72e3e357016021603f37bc8013157a244decd7feb1253594d08e359622dea192 WatchSource:0}: Error finding container 72e3e357016021603f37bc8013157a244decd7feb1253594d08e359622dea192: Status 404 returned error can't find the container with id 72e3e357016021603f37bc8013157a244decd7feb1253594d08e359622dea192 Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.535222 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.536687 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.543794 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.544419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8mgq4" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.544617 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.544829 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.584478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.627329 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32888aa3-cb52-484f-9745-5d5dfc5179df","Type":"ContainerStarted","Data":"72e3e357016021603f37bc8013157a244decd7feb1253594d08e359622dea192"} Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.684859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-75123828-7dc3-4108-9122-11aa15e1ed28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75123828-7dc3-4108-9122-11aa15e1ed28\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.684903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-default\") pod \"openstack-cell1-galera-0\" 
(UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.684924 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.684956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.684982 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.685040 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.685063 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6jhk\" (UniqueName: 
\"kubernetes.io/projected/869ad9c0-3593-4ebc-9b58-7b9615e46927-kube-api-access-j6jhk\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.685093 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786219 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6jhk\" (UniqueName: 
\"kubernetes.io/projected/869ad9c0-3593-4ebc-9b58-7b9615e46927-kube-api-access-j6jhk\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-75123828-7dc3-4108-9122-11aa15e1ed28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75123828-7dc3-4108-9122-11aa15e1ed28\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.786409 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.787641 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.787903 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.787948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.788622 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.788776 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.790548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/869ad9c0-3593-4ebc-9b58-7b9615e46927-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.796389 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-m2jk4" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.796497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.796841 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.796839 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869ad9c0-3593-4ebc-9b58-7b9615e46927-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.797467 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.811529 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.830590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6jhk\" (UniqueName: \"kubernetes.io/projected/869ad9c0-3593-4ebc-9b58-7b9615e46927-kube-api-access-j6jhk\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.889068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.889135 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-config-data\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.889183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-kolla-config\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.889207 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.889329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgcd7\" (UniqueName: \"kubernetes.io/projected/eb62c429-ac2b-4654-84e2-c92b0508eba4-kube-api-access-hgcd7\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.921738 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.921786 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-75123828-7dc3-4108-9122-11aa15e1ed28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75123828-7dc3-4108-9122-11aa15e1ed28\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/289d70ef73fdd2cd04fe322d12780ad553ce2cd2ff891235c78fb5975cdf2845/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.992829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.993152 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-config-data\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.993209 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-kolla-config\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.993232 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " 
pod="openstack/memcached-0" Jan 21 15:43:23 crc kubenswrapper[4773]: I0121 15:43:23.993281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgcd7\" (UniqueName: \"kubernetes.io/projected/eb62c429-ac2b-4654-84e2-c92b0508eba4-kube-api-access-hgcd7\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.000276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.001997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-config-data\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.002664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb62c429-ac2b-4654-84e2-c92b0508eba4-kolla-config\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.003407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-75123828-7dc3-4108-9122-11aa15e1ed28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75123828-7dc3-4108-9122-11aa15e1ed28\") pod \"openstack-cell1-galera-0\" (UID: \"869ad9c0-3593-4ebc-9b58-7b9615e46927\") " pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.027810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb62c429-ac2b-4654-84e2-c92b0508eba4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.050293 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgcd7\" (UniqueName: \"kubernetes.io/projected/eb62c429-ac2b-4654-84e2-c92b0508eba4-kube-api-access-hgcd7\") pod \"memcached-0\" (UID: \"eb62c429-ac2b-4654-84e2-c92b0508eba4\") " pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.191054 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.204719 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 21 15:43:24 crc kubenswrapper[4773]: I0121 15:43:24.889287 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.020226 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.206395 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.206446 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:43:25 crc 
kubenswrapper[4773]: I0121 15:43:25.669843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"869ad9c0-3593-4ebc-9b58-7b9615e46927","Type":"ContainerStarted","Data":"ce6ee5f87eac3151bb9e5072ed01d36440c6210228f7fc511b0eddd0076fc3d7"} Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.678338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb62c429-ac2b-4654-84e2-c92b0508eba4","Type":"ContainerStarted","Data":"081448b938620ae76524197afc8f67429d60344ff07aa54d600381e6e47c39fd"} Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.845928 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.847014 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.849983 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n484g" Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.858588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 15:43:25 crc kubenswrapper[4773]: I0121 15:43:25.944534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rql5b\" (UniqueName: \"kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b\") pod \"kube-state-metrics-0\" (UID: \"d7552a02-8d95-4fce-b6b0-7bbac761ad35\") " pod="openstack/kube-state-metrics-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.049321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rql5b\" (UniqueName: \"kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b\") pod \"kube-state-metrics-0\" (UID: 
\"d7552a02-8d95-4fce-b6b0-7bbac761ad35\") " pod="openstack/kube-state-metrics-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.079812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rql5b\" (UniqueName: \"kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b\") pod \"kube-state-metrics-0\" (UID: \"d7552a02-8d95-4fce-b6b0-7bbac761ad35\") " pod="openstack/kube-state-metrics-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.192463 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.739754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.741615 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.749860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.750399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.750930 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.759128 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-2lfkz" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.761425 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.762350 4773 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863179 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863229 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863279 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863423 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4ht\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-kube-api-access-5m4ht\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.863630 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.964962 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965041 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965099 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965116 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4ht\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-kube-api-access-5m4ht\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.965197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.983777 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:26 crc kubenswrapper[4773]: I0121 15:43:26.987642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:26.987773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.000091 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f6a4485-3dde-469d-98ee-026edcc3eb76-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:26.987998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f6a4485-3dde-469d-98ee-026edcc3eb76-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:26.990475 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-tls-assets\") pod 
\"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.028655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4ht\" (UniqueName: \"kubernetes.io/projected/8f6a4485-3dde-469d-98ee-026edcc3eb76-kube-api-access-5m4ht\") pod \"alertmanager-metric-storage-0\" (UID: \"8f6a4485-3dde-469d-98ee-026edcc3eb76\") " pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.085255 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.139677 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 15:43:27 crc kubenswrapper[4773]: W0121 15:43:27.186705 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7552a02_8d95_4fce_b6b0_7bbac761ad35.slice/crio-d415121dfbc3351ecc78b0b2eb2e655032eded61af7aa1eac930bfda353d68e4 WatchSource:0}: Error finding container d415121dfbc3351ecc78b0b2eb2e655032eded61af7aa1eac930bfda353d68e4: Status 404 returned error can't find the container with id d415121dfbc3351ecc78b0b2eb2e655032eded61af7aa1eac930bfda353d68e4 Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.203933 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.208019 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.209714 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.209714 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.211902 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.212296 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.213149 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.213561 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.213582 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tgvvj" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.213682 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.222041 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.271985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwjm\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272224 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272277 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272335 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.272446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " 
pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.373956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374061 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374116 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwjm\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374252 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.374286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod 
\"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.375573 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.375885 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.377068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.380309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.380454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.380752 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.380786 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e54ebd5078dca103e79dcb1052666590d0c6b422a63be683d9d63cec387cb90/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.382121 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.395163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.396544 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.397769 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwjm\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.447926 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.598252 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.719360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7552a02-8d95-4fce-b6b0-7bbac761ad35","Type":"ContainerStarted","Data":"d415121dfbc3351ecc78b0b2eb2e655032eded61af7aa1eac930bfda353d68e4"} Jan 21 15:43:27 crc kubenswrapper[4773]: I0121 15:43:27.820305 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 21 15:43:27 crc kubenswrapper[4773]: W0121 15:43:27.851135 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6a4485_3dde_469d_98ee_026edcc3eb76.slice/crio-eb65f2156edcb2f8f617e8fd64a81a48007d54b8ba40fc73b0595fd27db2afdb WatchSource:0}: Error finding container eb65f2156edcb2f8f617e8fd64a81a48007d54b8ba40fc73b0595fd27db2afdb: Status 404 returned error can't find the container with id eb65f2156edcb2f8f617e8fd64a81a48007d54b8ba40fc73b0595fd27db2afdb Jan 21 15:43:28 crc kubenswrapper[4773]: I0121 15:43:28.175198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:43:28 crc kubenswrapper[4773]: I0121 15:43:28.734163 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8f6a4485-3dde-469d-98ee-026edcc3eb76","Type":"ContainerStarted","Data":"eb65f2156edcb2f8f617e8fd64a81a48007d54b8ba40fc73b0595fd27db2afdb"} Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.631626 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zwrfs"] Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.633812 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.644731 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zwrfs"] Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.661419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-x9hf4" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.664962 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.665377 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.690423 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mvwkv"] Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.696728 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.727977 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f582857-cae4-4fa2-896d-b763b224ad8e-scripts\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-ovn-controller-tls-certs\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728405 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pw2\" (UniqueName: \"kubernetes.io/projected/1f582857-cae4-4fa2-896d-b763b224ad8e-kube-api-access-88pw2\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728434 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-log-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " 
pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.728553 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-combined-ca-bundle\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.775222 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mvwkv"] Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.832510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.832639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.832803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-combined-ca-bundle\") pod \"ovn-controller-zwrfs\" 
(UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.832835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.832882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7c6c\" (UniqueName: \"kubernetes.io/projected/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-kube-api-access-s7c6c\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.833152 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-run\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-run\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834199 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f582857-cae4-4fa2-896d-b763b224ad8e-scripts\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: 
I0121 15:43:29.834239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-ovn-controller-tls-certs\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-log\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834336 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-lib\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-etc-ovs\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-scripts\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834830 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-88pw2\" (UniqueName: \"kubernetes.io/projected/1f582857-cae4-4fa2-896d-b763b224ad8e-kube-api-access-88pw2\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.834872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-log-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.835447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f582857-cae4-4fa2-896d-b763b224ad8e-var-log-ovn\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.836484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f582857-cae4-4fa2-896d-b763b224ad8e-scripts\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.840935 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-combined-ca-bundle\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.846787 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1f582857-cae4-4fa2-896d-b763b224ad8e-ovn-controller-tls-certs\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.866745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pw2\" (UniqueName: \"kubernetes.io/projected/1f582857-cae4-4fa2-896d-b763b224ad8e-kube-api-access-88pw2\") pod \"ovn-controller-zwrfs\" (UID: \"1f582857-cae4-4fa2-896d-b763b224ad8e\") " pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.959614 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-run\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.959729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-log\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.959769 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-lib\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.959829 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-run\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") 
" pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.959851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-etc-ovs\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.960074 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-log\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.962053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-var-lib\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.962062 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-etc-ovs\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.962060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-scripts\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.962615 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s7c6c\" (UniqueName: \"kubernetes.io/projected/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-kube-api-access-s7c6c\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.968355 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-scripts\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:29 crc kubenswrapper[4773]: I0121 15:43:29.982584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7c6c\" (UniqueName: \"kubernetes.io/projected/b457bfe0-3f48-4e19-88a8-2b1ccefa549f-kube-api-access-s7c6c\") pod \"ovn-controller-ovs-mvwkv\" (UID: \"b457bfe0-3f48-4e19-88a8-2b1ccefa549f\") " pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.021040 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zwrfs" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.034139 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mvwkv" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.160197 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.164000 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.168047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.168262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bq98s" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.168803 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.171124 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.171191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.171330 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269206 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269540 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269573 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rqdz\" (UniqueName: \"kubernetes.io/projected/088efb4e-fd31-4648-88cf-ceac1edb1723-kube-api-access-2rqdz\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269631 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269781 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-config\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.269836 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.373665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.373772 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.373855 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.373904 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rqdz\" (UniqueName: \"kubernetes.io/projected/088efb4e-fd31-4648-88cf-ceac1edb1723-kube-api-access-2rqdz\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.373956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.374020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.374053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.374106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-config\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.374763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.375671 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-config\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc 
kubenswrapper[4773]: I0121 15:43:30.375929 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/088efb4e-fd31-4648-88cf-ceac1edb1723-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.377184 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.377212 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e68798fea3f761ab8ac383504ef9a2fe8bf079f553aeabd2187f6bdc1ae6f3c3/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.379025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.379626 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.381738 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088efb4e-fd31-4648-88cf-ceac1edb1723-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.414610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rqdz\" (UniqueName: \"kubernetes.io/projected/088efb4e-fd31-4648-88cf-ceac1edb1723-kube-api-access-2rqdz\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.447666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c7a150-2222-4faa-a66a-0b86893d2bb3\") pod \"ovsdbserver-nb-0\" (UID: \"088efb4e-fd31-4648-88cf-ceac1edb1723\") " pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:30 crc kubenswrapper[4773]: I0121 15:43:30.499104 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.094612 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.096500 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.098685 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.098870 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.098912 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jvkts" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.099200 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.113752 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266149 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4463c6b1-41db-4bdd-a867-89936873241c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4463c6b1-41db-4bdd-a867-89936873241c\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-scripts\") 
pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266313 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266396 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-config\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266436 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gv8\" (UniqueName: \"kubernetes.io/projected/7656b240-dd39-4bd3-8cdb-2f5103f17656-kube-api-access-58gv8\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.266488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.368208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.368546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-config\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.368681 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58gv8\" (UniqueName: \"kubernetes.io/projected/7656b240-dd39-4bd3-8cdb-2f5103f17656-kube-api-access-58gv8\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.368872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.368999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4463c6b1-41db-4bdd-a867-89936873241c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4463c6b1-41db-4bdd-a867-89936873241c\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc 
kubenswrapper[4773]: I0121 15:43:35.369124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.369230 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.370315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.370281 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.369347 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7656b240-dd39-4bd3-8cdb-2f5103f17656-config\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.370916 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.375480 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.375528 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4463c6b1-41db-4bdd-a867-89936873241c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4463c6b1-41db-4bdd-a867-89936873241c\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/884dc5f4ddfd244711eafa583c71fe6ba32af98926e46bb5df0e71a63d884440/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.377665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.379526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.383802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7656b240-dd39-4bd3-8cdb-2f5103f17656-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.394181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gv8\" (UniqueName: \"kubernetes.io/projected/7656b240-dd39-4bd3-8cdb-2f5103f17656-kube-api-access-58gv8\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.412466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4463c6b1-41db-4bdd-a867-89936873241c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4463c6b1-41db-4bdd-a867-89936873241c\") pod \"ovsdbserver-sb-0\" (UID: \"7656b240-dd39-4bd3-8cdb-2f5103f17656\") " pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:35 crc kubenswrapper[4773]: I0121 15:43:35.715294 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.762100 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7"] Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.763509 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.765393 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.765397 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-lk2mx" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.766164 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.766787 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Jan 21 15:43:38 crc kubenswrapper[4773]: I0121 15:43:38.767041 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.786349 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.917000 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.918159 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.921524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.921765 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.925015 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.934601 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.934676 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n68v5\" (UniqueName: \"kubernetes.io/projected/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-kube-api-access-n68v5\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.934832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 
15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.934861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.934909 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:38.957024 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.014658 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.015774 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.018549 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.018715 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcpqv\" (UniqueName: \"kubernetes.io/projected/b00c4a6e-de8c-43d7-a190-bd512a63f9de-kube-api-access-rcpqv\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038392 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038525 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038581 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-http\") pod 
\"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038722 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.038754 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n68v5\" (UniqueName: \"kubernetes.io/projected/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-kube-api-access-n68v5\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.039736 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.040168 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-config\") pod 
\"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.047464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.049774 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.066273 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.075220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n68v5\" (UniqueName: \"kubernetes.io/projected/e87ce1ac-65bc-4e61-a72a-381b6f7653f9-kube-api-access-n68v5\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-xkhx7\" (UID: \"e87ce1ac-65bc-4e61-a72a-381b6f7653f9\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc 
kubenswrapper[4773]: I0121 15:43:39.088127 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcpqv\" (UniqueName: \"kubernetes.io/projected/b00c4a6e-de8c-43d7-a190-bd512a63f9de-kube-api-access-rcpqv\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140739 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140771 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140819 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140849 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140924 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s2r25\" (UniqueName: \"kubernetes.io/projected/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-kube-api-access-s2r25\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140948 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.140978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.146538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.148720 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b00c4a6e-de8c-43d7-a190-bd512a63f9de-config\") pod 
\"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.153324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.171407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.180776 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcpqv\" (UniqueName: \"kubernetes.io/projected/b00c4a6e-de8c-43d7-a190-bd512a63f9de-kube-api-access-rcpqv\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.191367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b00c4a6e-de8c-43d7-a190-bd512a63f9de-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-9886v\" (UID: \"b00c4a6e-de8c-43d7-a190-bd512a63f9de\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.215742 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.216852 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.222414 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.222706 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.222804 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.225234 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-qhwbz" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.225399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.225529 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.225728 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.244605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc 
kubenswrapper[4773]: I0121 15:43:39.244664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.244724 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.244765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2r25\" (UniqueName: \"kubernetes.io/projected/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-kube-api-access-s2r25\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.244785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.250932 4773 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.251398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.256110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.264826 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.287630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.288133 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: 
\"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.293091 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2r25\" (UniqueName: \"kubernetes.io/projected/103932c1-cb6f-4a90-9d70-0dcc1787a5b7-kube-api-access-s2r25\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-phpsd\" (UID: \"103932c1-cb6f-4a90-9d70-0dcc1787a5b7\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.293144 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.300231 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.320249 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.337904 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.349801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.349870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.349908 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.349929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.349951 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.350015 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.350039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.350060 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb6kt\" (UniqueName: \"kubernetes.io/projected/3bd14b5f-6a17-41d2-bd18-522651121850-kube-api-access-gb6kt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.350081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451436 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb6kt\" (UniqueName: \"kubernetes.io/projected/3bd14b5f-6a17-41d2-bd18-522651121850-kube-api-access-gb6kt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451525 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451601 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkbvr\" (UniqueName: \"kubernetes.io/projected/45eeec38-a51c-4228-8616-335dc3b951b7-kube-api-access-tkbvr\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451665 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451680 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" 
(UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451870 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.451921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.452153 4773 configmap.go:193] Couldn't get configMap openstack/cloudkitty-lokistack-gateway-ca-bundle: configmap "cloudkitty-lokistack-gateway-ca-bundle" not found Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.452239 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle podName:3bd14b5f-6a17-41d2-bd18-522651121850 nodeName:}" failed. No retries permitted until 2026-01-21 15:43:39.952219168 +0000 UTC m=+1184.876708790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloudkitty-lokistack-gateway-ca-bundle" (UniqueName: "kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle") pod "cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" (UID: "3bd14b5f-6a17-41d2-bd18-522651121850") : configmap "cloudkitty-lokistack-gateway-ca-bundle" not found Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.452252 4773 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.452300 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret podName:3bd14b5f-6a17-41d2-bd18-522651121850 nodeName:}" failed. No retries permitted until 2026-01-21 15:43:39.95229089 +0000 UTC m=+1184.876780512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" (UID: "3bd14b5f-6a17-41d2-bd18-522651121850") : secret "cloudkitty-lokistack-gateway-http" not found Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.452858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.453380 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.453402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.459451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc 
kubenswrapper[4773]: I0121 15:43:39.459504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.474435 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb6kt\" (UniqueName: \"kubernetes.io/projected/3bd14b5f-6a17-41d2-bd18-522651121850-kube-api-access-gb6kt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553195 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkbvr\" (UniqueName: \"kubernetes.io/projected/45eeec38-a51c-4228-8616-335dc3b951b7-kube-api-access-tkbvr\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc 
kubenswrapper[4773]: I0121 15:43:39.553372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.553420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.553912 4773 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 21 15:43:52 crc kubenswrapper[4773]: E0121 15:43:39.553975 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret podName:45eeec38-a51c-4228-8616-335dc3b951b7 nodeName:}" failed. No retries permitted until 2026-01-21 15:43:40.053957784 +0000 UTC m=+1184.978447406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" (UID: "45eeec38-a51c-4228-8616-335dc3b951b7") : secret "cloudkitty-lokistack-gateway-http" not found Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.554661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.554853 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.554866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.555169 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.555473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.556461 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.557044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.570858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkbvr\" (UniqueName: \"kubernetes.io/projected/45eeec38-a51c-4228-8616-335dc3b951b7-kube-api-access-tkbvr\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.847911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.910216 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.911558 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.921153 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.922477 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.922855 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.963714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.963888 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.964652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bd14b5f-6a17-41d2-bd18-522651121850-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.967402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/3bd14b5f-6a17-41d2-bd18-522651121850-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr\" (UID: \"3bd14b5f-6a17-41d2-bd18-522651121850\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:39.997562 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.001455 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.004125 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.010825 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.015886 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.065978 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066028 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066075 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066105 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066196 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvcp\" (UniqueName: \"kubernetes.io/projected/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-kube-api-access-vfvcp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.066539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.069116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/45eeec38-a51c-4228-8616-335dc3b951b7-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s\" (UID: \"45eeec38-a51c-4228-8616-335dc3b951b7\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.101267 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.102494 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.109790 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.110509 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.115913 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168118 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168253 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmcwc\" (UniqueName: \"kubernetes.io/projected/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-kube-api-access-bmcwc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168366 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168386 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168445 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168542 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168599 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168603 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168758 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168811 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168845 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqhj8\" (UniqueName: \"kubernetes.io/projected/44cd53f0-37e0-4b02-9922-49d99dfee92a-kube-api-access-jqhj8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.168992 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.169061 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvcp\" (UniqueName: \"kubernetes.io/projected/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-kube-api-access-vfvcp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.169087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.169364 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.170063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.170103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.172420 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-http\") pod 
\"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.179451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.186117 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.188345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvcp\" (UniqueName: \"kubernetes.io/projected/68c0e8c6-bc28-4101-a1d5-99ce639ae62c-kube-api-access-vfvcp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.188590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.191369 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"68c0e8c6-bc28-4101-a1d5-99ce639ae62c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.198159 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.224603 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmcwc\" (UniqueName: 
\"kubernetes.io/projected/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-kube-api-access-bmcwc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270774 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270814 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270838 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270857 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270925 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.270945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqhj8\" (UniqueName: \"kubernetes.io/projected/44cd53f0-37e0-4b02-9922-49d99dfee92a-kube-api-access-jqhj8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc 
kubenswrapper[4773]: I0121 15:43:40.270978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.271971 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.272383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.272810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.272887 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.272902 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.273004 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.273298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.277994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.288037 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.288081 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.288454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/44cd53f0-37e0-4b02-9922-49d99dfee92a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.288635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.289560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.293191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqhj8\" (UniqueName: \"kubernetes.io/projected/44cd53f0-37e0-4b02-9922-49d99dfee92a-kube-api-access-jqhj8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 
15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.293602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmcwc\" (UniqueName: \"kubernetes.io/projected/9e04988a-e98a-4c9d-9a51-e0e69a6810c9-kube-api-access-bmcwc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.296268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"9e04988a-e98a-4c9d-9a51-e0e69a6810c9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.312949 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"44cd53f0-37e0-4b02-9922-49d99dfee92a\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.336306 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 21 15:43:52 crc kubenswrapper[4773]: I0121 15:43:40.431821 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 21 15:43:55 crc kubenswrapper[4773]: I0121 15:43:55.205877 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:43:55 crc kubenswrapper[4773]: I0121 15:43:55.206423 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:43:55 crc kubenswrapper[4773]: I0121 15:43:55.206465 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:43:55 crc kubenswrapper[4773]: I0121 15:43:55.207198 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:43:55 crc kubenswrapper[4773]: I0121 15:43:55.207256 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44" gracePeriod=600 Jan 21 15:43:59 crc kubenswrapper[4773]: I0121 15:43:59.059054 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerStarted","Data":"c3783e3f175efc423655266770ef42770d5e9dc7ced78825682381a82ff01b51"} Jan 21 15:43:59 crc kubenswrapper[4773]: I0121 15:43:59.063293 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44" exitCode=0 Jan 21 15:43:59 crc kubenswrapper[4773]: I0121 15:43:59.063342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44"} Jan 21 15:43:59 crc kubenswrapper[4773]: I0121 15:43:59.063378 4773 scope.go:117] "RemoveContainer" containerID="ac9fadc09282233e8c4f18266ba6204c80ab33ee79a6058a1eff20ea540a3140" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.231895 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.232909 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qvrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(32888aa3-cb52-484f-9745-5d5dfc5179df): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.234087 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="32888aa3-cb52-484f-9745-5d5dfc5179df" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.279755 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.279921 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/
var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6jhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(869ad9c0-3593-4ebc-9b58-7b9615e46927): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:10 crc kubenswrapper[4773]: E0121 15:44:10.281969 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="869ad9c0-3593-4ebc-9b58-7b9615e46927" Jan 21 15:44:11 crc kubenswrapper[4773]: E0121 15:44:11.149266 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="869ad9c0-3593-4ebc-9b58-7b9615e46927" Jan 21 15:44:11 crc kubenswrapper[4773]: E0121 15:44:11.149304 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="32888aa3-cb52-484f-9745-5d5dfc5179df" Jan 21 15:44:11 crc kubenswrapper[4773]: E0121 15:44:11.537936 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 21 15:44:11 crc kubenswrapper[4773]: E0121 15:44:11.538152 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djkvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(e5b1166d-2f9b-452c-a0b2-e7f21998ff45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:11 crc 
kubenswrapper[4773]: E0121 15:44:11.539404 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" Jan 21 15:44:12 crc kubenswrapper[4773]: E0121 15:44:12.154427 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" Jan 21 15:44:12 crc kubenswrapper[4773]: E0121 15:44:12.260824 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 21 15:44:12 crc kubenswrapper[4773]: E0121 15:44:12.261180 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5f7h9fh57h586h78h589hc8h98h64h679h646h68dh548h678h696h67hfchdchcch646h4h5f8h66bh86h9h5c6h564h587h575h6hf4h587q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgcd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(eb62c429-ac2b-4654-84e2-c92b0508eba4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:12 crc kubenswrapper[4773]: E0121 15:44:12.263063 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="eb62c429-ac2b-4654-84e2-c92b0508eba4" Jan 21 15:44:13 crc kubenswrapper[4773]: E0121 15:44:13.164928 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="eb62c429-ac2b-4654-84e2-c92b0508eba4" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.868315 4773 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.869190 4773 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.869936 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rql5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(d7552a02-8d95-4fce-b6b0-7bbac761ad35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.871187 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.897274 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.897434 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh 
-c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgzbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityC
ontext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(1849053d-528d-42bf-93f3-31cb3ef1c91e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.898555 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.917299 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.917560 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckctl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-tjchm_openstack(b422d823-b626-442e-8b51-2439c4046d71): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.918746 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm" podUID="b422d823-b626-442e-8b51-2439c4046d71" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.934509 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.934722 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kkbmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-grw5k_openstack(f0aa7494-cbc8-4bca-bf38-33ef045d689b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.935990 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k" podUID="f0aa7494-cbc8-4bca-bf38-33ef045d689b" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.957831 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.957978 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrpmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-mldv7_openstack(e7edb06e-2b04-4d03-b089-4724a466d720): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.959213 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.987964 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.988121 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjhhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-4hbk9_openstack(6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:44:14 crc kubenswrapper[4773]: E0121 15:44:14.989332 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" Jan 21 15:44:15 crc kubenswrapper[4773]: E0121 15:44:15.181257 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" Jan 21 15:44:15 crc kubenswrapper[4773]: E0121 15:44:15.181379 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" Jan 21 15:44:15 crc kubenswrapper[4773]: E0121 15:44:15.181474 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" Jan 21 15:44:15 crc kubenswrapper[4773]: E0121 15:44:15.181582 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.802993 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.804855 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.925445 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckctl\" (UniqueName: \"kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl\") pod \"b422d823-b626-442e-8b51-2439c4046d71\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.925524 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbmm\" (UniqueName: \"kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm\") pod \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.925587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config\") pod \"b422d823-b626-442e-8b51-2439c4046d71\" (UID: \"b422d823-b626-442e-8b51-2439c4046d71\") " Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.925663 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config\") pod \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\" (UID: \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.925723 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc\") pod \"f0aa7494-cbc8-4bca-bf38-33ef045d689b\" (UID: 
\"f0aa7494-cbc8-4bca-bf38-33ef045d689b\") " Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.927012 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0aa7494-cbc8-4bca-bf38-33ef045d689b" (UID: "f0aa7494-cbc8-4bca-bf38-33ef045d689b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.927682 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config" (OuterVolumeSpecName: "config") pod "b422d823-b626-442e-8b51-2439c4046d71" (UID: "b422d823-b626-442e-8b51-2439c4046d71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.928456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config" (OuterVolumeSpecName: "config") pod "f0aa7494-cbc8-4bca-bf38-33ef045d689b" (UID: "f0aa7494-cbc8-4bca-bf38-33ef045d689b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.942881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm" (OuterVolumeSpecName: "kube-api-access-kkbmm") pod "f0aa7494-cbc8-4bca-bf38-33ef045d689b" (UID: "f0aa7494-cbc8-4bca-bf38-33ef045d689b"). InnerVolumeSpecName "kube-api-access-kkbmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:15 crc kubenswrapper[4773]: I0121 15:44:15.943199 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl" (OuterVolumeSpecName: "kube-api-access-ckctl") pod "b422d823-b626-442e-8b51-2439c4046d71" (UID: "b422d823-b626-442e-8b51-2439c4046d71"). InnerVolumeSpecName "kube-api-access-ckctl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.033005 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckctl\" (UniqueName: \"kubernetes.io/projected/b422d823-b626-442e-8b51-2439c4046d71-kube-api-access-ckctl\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.033046 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkbmm\" (UniqueName: \"kubernetes.io/projected/f0aa7494-cbc8-4bca-bf38-33ef045d689b-kube-api-access-kkbmm\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.033063 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b422d823-b626-442e-8b51-2439c4046d71-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.033075 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.033089 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0aa7494-cbc8-4bca-bf38-33ef045d689b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.222222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm" event={"ID":"b422d823-b626-442e-8b51-2439c4046d71","Type":"ContainerDied","Data":"af7e49fbad01207761c158ff9313e87ec653b3d20f8652b7f520aa847feb8e90"} Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.224121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k" event={"ID":"f0aa7494-cbc8-4bca-bf38-33ef045d689b","Type":"ContainerDied","Data":"11d02b3d9b9a2c7b95e9b3e6c16e54a4abacd6b2197bb1fc5c407854d5e9e42c"} Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.223715 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-grw5k" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.222502 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tjchm" Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.332812 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.343525 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-grw5k"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.367780 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.376646 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tjchm"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.591972 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.614684 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.706680 4773 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.734991 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.757733 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zwrfs"] Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.762514 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f582857_cae4_4fa2_896d_b763b224ad8e.slice/crio-795bc3acadf5de449a41718e4311ca9dd0f4e48b5a31e10a69af1995df116ef7 WatchSource:0}: Error finding container 795bc3acadf5de449a41718e4311ca9dd0f4e48b5a31e10a69af1995df116ef7: Status 404 returned error can't find the container with id 795bc3acadf5de449a41718e4311ca9dd0f4e48b5a31e10a69af1995df116ef7 Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.804900 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod088efb4e_fd31_4648_88cf_ceac1edb1723.slice/crio-93f90e2bf6d1c316a74621f10e888d26d0c5dda780d094ccc051f094b2e72b12 WatchSource:0}: Error finding container 93f90e2bf6d1c316a74621f10e888d26d0c5dda780d094ccc051f094b2e72b12: Status 404 returned error can't find the container with id 93f90e2bf6d1c316a74621f10e888d26d0c5dda780d094ccc051f094b2e72b12 Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.807644 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.858546 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.859763 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7656b240_dd39_4bd3_8cdb_2f5103f17656.slice/crio-dcd110a4d6ab4f24edf6ee6f2a09924050b8bc97996d9a43ee43ac7d8d31a4aa WatchSource:0}: Error finding container dcd110a4d6ab4f24edf6ee6f2a09924050b8bc97996d9a43ee43ac7d8d31a4aa: Status 404 returned error can't find the container with id dcd110a4d6ab4f24edf6ee6f2a09924050b8bc97996d9a43ee43ac7d8d31a4aa
Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.927200 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.929308 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68c0e8c6_bc28_4101_a1d5_99ce639ae62c.slice/crio-4f9aeb8155406c8f415b539ff564df5a471b1606964f1cee496d1654dc471e60 WatchSource:0}: Error finding container 4f9aeb8155406c8f415b539ff564df5a471b1606964f1cee496d1654dc471e60: Status 404 returned error can't find the container with id 4f9aeb8155406c8f415b539ff564df5a471b1606964f1cee496d1654dc471e60
Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.937348 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45eeec38_a51c_4228_8616_335dc3b951b7.slice/crio-dd593db5a9557ec1c20f9bf4383b77a084f3b53c448e6d1db3ed74e1f13654a4 WatchSource:0}: Error finding container dd593db5a9557ec1c20f9bf4383b77a084f3b53c448e6d1db3ed74e1f13654a4: Status 404 returned error can't find the container with id dd593db5a9557ec1c20f9bf4383b77a084f3b53c448e6d1db3ed74e1f13654a4
Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.937896 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s"]
Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.952105 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v"]
Jan 21 15:44:16 crc kubenswrapper[4773]: W0121 15:44:16.953952 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00c4a6e_de8c_43d7_a190_bd512a63f9de.slice/crio-1773a84ce0a54d4dc840dd1f2f00620a8c67d94a1b894b84acef296ba86275b0 WatchSource:0}: Error finding container 1773a84ce0a54d4dc840dd1f2f00620a8c67d94a1b894b84acef296ba86275b0: Status 404 returned error can't find the container with id 1773a84ce0a54d4dc840dd1f2f00620a8c67d94a1b894b84acef296ba86275b0
Jan 21 15:44:16 crc kubenswrapper[4773]: I0121 15:44:16.975969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7"]
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.235970 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" event={"ID":"103932c1-cb6f-4a90-9d70-0dcc1787a5b7","Type":"ContainerStarted","Data":"f1ca47f0b15228a230b7556126d3d121a048b78cf5929d8cde4214a27c516e28"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.236788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" event={"ID":"45eeec38-a51c-4228-8616-335dc3b951b7","Type":"ContainerStarted","Data":"dd593db5a9557ec1c20f9bf4383b77a084f3b53c448e6d1db3ed74e1f13654a4"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.237866 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"68c0e8c6-bc28-4101-a1d5-99ce639ae62c","Type":"ContainerStarted","Data":"4f9aeb8155406c8f415b539ff564df5a471b1606964f1cee496d1654dc471e60"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.243966 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" event={"ID":"3bd14b5f-6a17-41d2-bd18-522651121850","Type":"ContainerStarted","Data":"d8cd5b0770441dcdb27e22ac9ee40d0ba64c1c5c0c4602b3da79ba554c057bdb"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.262301 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" event={"ID":"b00c4a6e-de8c-43d7-a190-bd512a63f9de","Type":"ContainerStarted","Data":"1773a84ce0a54d4dc840dd1f2f00620a8c67d94a1b894b84acef296ba86275b0"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.263678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"9e04988a-e98a-4c9d-9a51-e0e69a6810c9","Type":"ContainerStarted","Data":"acf7121c08237e7cce7c7c428e0a92e8879c2709fc6a762639585da78bcda3f3"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.269916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" event={"ID":"e87ce1ac-65bc-4e61-a72a-381b6f7653f9","Type":"ContainerStarted","Data":"bde913a896b04e98c73329eda587b235b0dddafcecabdb5c25239446e3fbf041"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.284209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zwrfs" event={"ID":"1f582857-cae4-4fa2-896d-b763b224ad8e","Type":"ContainerStarted","Data":"795bc3acadf5de449a41718e4311ca9dd0f4e48b5a31e10a69af1995df116ef7"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.305016 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"088efb4e-fd31-4648-88cf-ceac1edb1723","Type":"ContainerStarted","Data":"93f90e2bf6d1c316a74621f10e888d26d0c5dda780d094ccc051f094b2e72b12"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.311176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"44cd53f0-37e0-4b02-9922-49d99dfee92a","Type":"ContainerStarted","Data":"973186d499441bc43715adb8e128b344dcd06c7f43c1efdd08b0dbea7f6fe0ad"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.313498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.315777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7656b240-dd39-4bd3-8cdb-2f5103f17656","Type":"ContainerStarted","Data":"dcd110a4d6ab4f24edf6ee6f2a09924050b8bc97996d9a43ee43ac7d8d31a4aa"}
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.398659 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b422d823-b626-442e-8b51-2439c4046d71" path="/var/lib/kubelet/pods/b422d823-b626-442e-8b51-2439c4046d71/volumes"
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.399131 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0aa7494-cbc8-4bca-bf38-33ef045d689b" path="/var/lib/kubelet/pods/f0aa7494-cbc8-4bca-bf38-33ef045d689b/volumes"
Jan 21 15:44:17 crc kubenswrapper[4773]: I0121 15:44:17.815948 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mvwkv"]
Jan 21 15:44:18 crc kubenswrapper[4773]: I0121 15:44:18.346133 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvwkv" event={"ID":"b457bfe0-3f48-4e19-88a8-2b1ccefa549f","Type":"ContainerStarted","Data":"e5755df33f21c15595ef7d6c3a3a7ab80efc4a56a3acd07039f7e42c2cfe6eea"}
Jan 21 15:44:19 crc kubenswrapper[4773]: I0121 15:44:19.356704 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerStarted","Data":"45cfcc11228d723dedf07ab470919a57d8b9a7369e6e6b83008e0cd54ea3512d"}
Jan 21 15:44:20 crc kubenswrapper[4773]: I0121 15:44:20.371850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8f6a4485-3dde-469d-98ee-026edcc3eb76","Type":"ContainerStarted","Data":"8917942ef99a8fdda7f14e9937faa53f28cd27e19738203e3a861be0d994e171"}
Jan 21 15:44:26 crc kubenswrapper[4773]: I0121 15:44:26.425733 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerID="45cfcc11228d723dedf07ab470919a57d8b9a7369e6e6b83008e0cd54ea3512d" exitCode=0
Jan 21 15:44:26 crc kubenswrapper[4773]: I0121 15:44:26.425801 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerDied","Data":"45cfcc11228d723dedf07ab470919a57d8b9a7369e6e6b83008e0cd54ea3512d"}
Jan 21 15:44:26 crc kubenswrapper[4773]: I0121 15:44:26.429302 4773 generic.go:334] "Generic (PLEG): container finished" podID="8f6a4485-3dde-469d-98ee-026edcc3eb76" containerID="8917942ef99a8fdda7f14e9937faa53f28cd27e19738203e3a861be0d994e171" exitCode=0
Jan 21 15:44:26 crc kubenswrapper[4773]: I0121 15:44:26.429339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8f6a4485-3dde-469d-98ee-026edcc3eb76","Type":"ContainerDied","Data":"8917942ef99a8fdda7f14e9937faa53f28cd27e19738203e3a861be0d994e171"}
Jan 21 15:44:34 crc kubenswrapper[4773]: I0121 15:44:34.495034 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvwkv" event={"ID":"b457bfe0-3f48-4e19-88a8-2b1ccefa549f","Type":"ContainerStarted","Data":"b691a7ceb875b668136444e2ce46b68c3d08ea9708546b3cac0bf523bc860a9e"}
Jan 21 15:44:35 crc kubenswrapper[4773]: I0121 15:44:35.504376 4773 generic.go:334] "Generic (PLEG): container finished" podID="b457bfe0-3f48-4e19-88a8-2b1ccefa549f" containerID="b691a7ceb875b668136444e2ce46b68c3d08ea9708546b3cac0bf523bc860a9e" exitCode=0
Jan 21 15:44:35 crc kubenswrapper[4773]: I0121 15:44:35.504439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvwkv" event={"ID":"b457bfe0-3f48-4e19-88a8-2b1ccefa549f","Type":"ContainerDied","Data":"b691a7ceb875b668136444e2ce46b68c3d08ea9708546b3cac0bf523bc860a9e"}
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.525771 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zwrfs" event={"ID":"1f582857-cae4-4fa2-896d-b763b224ad8e","Type":"ContainerStarted","Data":"3f36d57437010ef44a8b3d542cc25d018fcd272178ca175ce6a87305f1ac006b"}
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.526314 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zwrfs"
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.527923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" event={"ID":"103932c1-cb6f-4a90-9d70-0dcc1787a5b7","Type":"ContainerStarted","Data":"975701e9221a971b9fa92859ab0b96640045358fb2cb71eba65ba12209f024ce"}
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.528088 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd"
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.530267 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" event={"ID":"e87ce1ac-65bc-4e61-a72a-381b6f7653f9","Type":"ContainerStarted","Data":"f964c801cd7e1fc9fbdc2ab9861a1cc1af2306d9d7373cb8c459bb099b8297f6"}
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.530379 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7"
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.553688 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zwrfs" podStartSLOduration=55.584345551 podStartE2EDuration="1m7.553665907s" podCreationTimestamp="2026-01-21 15:43:29 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.766104737 +0000 UTC m=+1221.690594349" lastFinishedPulling="2026-01-21 15:44:28.735425083 +0000 UTC m=+1233.659914705" observedRunningTime="2026-01-21 15:44:36.544843215 +0000 UTC m=+1241.469332857" watchObservedRunningTime="2026-01-21 15:44:36.553665907 +0000 UTC m=+1241.478155539"
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.589572 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" podStartSLOduration=46.845140244 podStartE2EDuration="58.58955039s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.989332201 +0000 UTC m=+1221.913821823" lastFinishedPulling="2026-01-21 15:44:28.733742347 +0000 UTC m=+1233.658231969" observedRunningTime="2026-01-21 15:44:36.566951431 +0000 UTC m=+1241.491441053" watchObservedRunningTime="2026-01-21 15:44:36.58955039 +0000 UTC m=+1241.514040012"
Jan 21 15:44:36 crc kubenswrapper[4773]: I0121 15:44:36.593930 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" podStartSLOduration=46.501491751 podStartE2EDuration="58.593913999s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.640168968 +0000 UTC m=+1221.564658590" lastFinishedPulling="2026-01-21 15:44:28.732591216 +0000 UTC m=+1233.657080838" observedRunningTime="2026-01-21 15:44:36.587217456 +0000 UTC m=+1241.511707078" watchObservedRunningTime="2026-01-21 15:44:36.593913999 +0000 UTC m=+1241.518403611"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.540217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"9e04988a-e98a-4c9d-9a51-e0e69a6810c9","Type":"ContainerStarted","Data":"aab694ac6410ded056e24062ee46804e5ed47e0590464ac525c25dc0b167ae83"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.540615 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.543978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" event={"ID":"45eeec38-a51c-4228-8616-335dc3b951b7","Type":"ContainerStarted","Data":"f2509defd6a00f83264f644f32604066bf9d2c630e7ac8c977f2b26cf4cd0912"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.544190 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.547428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7656b240-dd39-4bd3-8cdb-2f5103f17656","Type":"ContainerStarted","Data":"14e4e8bf8f9ea68c3a49851b9f3f33e434132e57bb85ffdc7abd6353a81feed9"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.550072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"68c0e8c6-bc28-4101-a1d5-99ce639ae62c","Type":"ContainerStarted","Data":"b88c17db84e2942e654c5e5065243884227d47fec7df99f2beb879a2761f60da"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.550186 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.552493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" event={"ID":"3bd14b5f-6a17-41d2-bd18-522651121850","Type":"ContainerStarted","Data":"48915b678008d2482e47e231d33eb00988558b4c96d4d7a843c52caa62b7f2ec"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.552802 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.555287 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"869ad9c0-3593-4ebc-9b58-7b9615e46927","Type":"ContainerStarted","Data":"87e6e51434ba11d721cb71c15b5ff6804207bda93141d4700ab8702b602bb459"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.558750 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=44.93982368 podStartE2EDuration="58.558735297s" podCreationTimestamp="2026-01-21 15:43:39 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.57805495 +0000 UTC m=+1221.502544572" lastFinishedPulling="2026-01-21 15:44:30.196966567 +0000 UTC m=+1235.121456189" observedRunningTime="2026-01-21 15:44:37.558592093 +0000 UTC m=+1242.483081715" watchObservedRunningTime="2026-01-21 15:44:37.558735297 +0000 UTC m=+1242.483224919"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.564719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" event={"ID":"b00c4a6e-de8c-43d7-a190-bd512a63f9de","Type":"ContainerStarted","Data":"5a6faf308798e537e1ba64f22eb8a9f275e8387f7454b1ab236a3d79e07d8e83"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.565705 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.570028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32888aa3-cb52-484f-9745-5d5dfc5179df","Type":"ContainerStarted","Data":"80b2953f89db8a0aaea03e3ec256b914b62a674cfe4c0a7ffc14f128e4fc91e7"}
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.572018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.573180 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.585069 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s" podStartSLOduration=46.792943605 podStartE2EDuration="58.585054098s" podCreationTimestamp="2026-01-21 15:43:39 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.940489563 +0000 UTC m=+1221.864979185" lastFinishedPulling="2026-01-21 15:44:28.732600056 +0000 UTC m=+1233.657089678" observedRunningTime="2026-01-21 15:44:37.581368457 +0000 UTC m=+1242.505858079" watchObservedRunningTime="2026-01-21 15:44:37.585054098 +0000 UTC m=+1242.509543720"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.605486 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=47.14934668 podStartE2EDuration="59.605468827s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.93146397 +0000 UTC m=+1221.855953592" lastFinishedPulling="2026-01-21 15:44:29.387586117 +0000 UTC m=+1234.312075739" observedRunningTime="2026-01-21 15:44:37.602627189 +0000 UTC m=+1242.527116811" watchObservedRunningTime="2026-01-21 15:44:37.605468827 +0000 UTC m=+1242.529958449"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.672114 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr" podStartSLOduration=46.041019872 podStartE2EDuration="58.672096492s" podCreationTimestamp="2026-01-21 15:43:39 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.75551834 +0000 UTC m=+1221.680007962" lastFinishedPulling="2026-01-21 15:44:29.38659496 +0000 UTC m=+1234.311084582" observedRunningTime="2026-01-21 15:44:37.645084253 +0000 UTC m=+1242.569573875" watchObservedRunningTime="2026-01-21 15:44:37.672096492 +0000 UTC m=+1242.596586114"
Jan 21 15:44:37 crc kubenswrapper[4773]: I0121 15:44:37.736809 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" podStartSLOduration=46.136468265 podStartE2EDuration="59.736782764s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.956170736 +0000 UTC m=+1221.880660358" lastFinishedPulling="2026-01-21 15:44:30.556485235 +0000 UTC m=+1235.480974857" observedRunningTime="2026-01-21 15:44:37.724842307 +0000 UTC m=+1242.649331929" watchObservedRunningTime="2026-01-21 15:44:37.736782764 +0000 UTC m=+1242.661272396"
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.609023 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerStarted","Data":"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.620467 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8f6a4485-3dde-469d-98ee-026edcc3eb76","Type":"ContainerStarted","Data":"40f289d5a645dfc594eb2a1c88b2ecc4565a4aaf567e2fb1aaa1d836a07796a4"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.622571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"44cd53f0-37e0-4b02-9922-49d99dfee92a","Type":"ContainerStarted","Data":"1c37f4fcff47e63de4c9d71da47a9544d448f33640b35d102d7e8461f597c337"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.623935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerStarted","Data":"7b4c0d39628d4adc9c6e17e0b7ec02c9bbcd1c5f07d5a898a70a8b53b9a96d66"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.628861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvwkv" event={"ID":"b457bfe0-3f48-4e19-88a8-2b1ccefa549f","Type":"ContainerStarted","Data":"9c22f6f885439bdae5c36d6cedb070dd6d6c93e4712c0f8055efa7f1648363f9"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.630498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb62c429-ac2b-4654-84e2-c92b0508eba4","Type":"ContainerStarted","Data":"38b10d82e9e127388b6e12c489be1e4f7fcda12c44e064cf1e471ec4556c38a4"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.633152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerStarted","Data":"82f759fcde8fbf130ede62373f729e90f94668088a233680bc741fe96fc65f41"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.634344 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerStarted","Data":"a2d82cd99a3e89e61c6361ec20503a0529ff9b77bb61e768caaf32d1b0602c8e"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.639597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"088efb4e-fd31-4648-88cf-ceac1edb1723","Type":"ContainerStarted","Data":"c320a550f480bb31abbc242486c839d2aaa18cbaaf352552e6f3355d0eeb37eb"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.643169 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7552a02-8d95-4fce-b6b0-7bbac761ad35","Type":"ContainerStarted","Data":"d6d7bd2ee2da7ff8ab8404974ba9925a73d46790cc3cfe296164f20446bdd932"}
Jan 21 15:44:42 crc kubenswrapper[4773]: I0121 15:44:42.644916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerStarted","Data":"bd218997989eb32d58d9b1df12ca198f1dceb6fe8b614b283eaafe8ec0424eb7"}
Jan 21 15:44:49 crc kubenswrapper[4773]: I0121 15:44:49.700883 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0"
Jan 21 15:44:49 crc kubenswrapper[4773]: I0121 15:44:49.722561 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=57.92280365 podStartE2EDuration="1m11.722540951s" podCreationTimestamp="2026-01-21 15:43:38 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.756752814 +0000 UTC m=+1221.681242436" lastFinishedPulling="2026-01-21 15:44:30.556490115 +0000 UTC m=+1235.480979737" observedRunningTime="2026-01-21 15:44:49.718361147 +0000 UTC m=+1254.642850769" watchObservedRunningTime="2026-01-21 15:44:49.722540951 +0000 UTC m=+1254.647030573"
Jan 21 15:44:49 crc kubenswrapper[4773]: I0121 15:44:49.763869 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.475261659 podStartE2EDuration="1m26.763844133s" podCreationTimestamp="2026-01-21 15:43:23 +0000 UTC" firstStartedPulling="2026-01-21 15:43:24.905867574 +0000 UTC m=+1169.830357196" lastFinishedPulling="2026-01-21 15:44:30.194450048 +0000 UTC m=+1235.118939670" observedRunningTime="2026-01-21 15:44:49.755998018 +0000 UTC m=+1254.680487660" watchObservedRunningTime="2026-01-21 15:44:49.763844133 +0000 UTC m=+1254.688333755"
Jan 21 15:44:49 crc kubenswrapper[4773]: I0121 15:44:49.846006 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.896166114 podStartE2EDuration="1m24.845980453s" podCreationTimestamp="2026-01-21 15:43:25 +0000 UTC" firstStartedPulling="2026-01-21 15:43:27.192325767 +0000 UTC m=+1172.116815389" lastFinishedPulling="2026-01-21 15:44:37.142140106 +0000 UTC m=+1242.066629728" observedRunningTime="2026-01-21 15:44:49.838851857 +0000 UTC m=+1254.763341479" watchObservedRunningTime="2026-01-21 15:44:49.845980453 +0000 UTC m=+1254.770470075"
Jan 21 15:44:54 crc kubenswrapper[4773]: I0121 15:44:54.205775 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 21 15:44:54 crc kubenswrapper[4773]: I0121 15:44:54.206945 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 21 15:44:55 crc kubenswrapper[4773]: I0121 15:44:55.752266 4773 generic.go:334] "Generic (PLEG): container finished" podID="e7edb06e-2b04-4d03-b089-4724a466d720" containerID="7b4c0d39628d4adc9c6e17e0b7ec02c9bbcd1c5f07d5a898a70a8b53b9a96d66" exitCode=0
Jan 21 15:44:55 crc kubenswrapper[4773]: I0121 15:44:55.752358 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerDied","Data":"7b4c0d39628d4adc9c6e17e0b7ec02c9bbcd1c5f07d5a898a70a8b53b9a96d66"}
Jan 21 15:44:55 crc kubenswrapper[4773]: I0121 15:44:55.755882 4773 generic.go:334] "Generic (PLEG): container finished" podID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerID="82f759fcde8fbf130ede62373f729e90f94668088a233680bc741fe96fc65f41" exitCode=0
Jan 21 15:44:55 crc kubenswrapper[4773]: I0121 15:44:55.755918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerDied","Data":"82f759fcde8fbf130ede62373f729e90f94668088a233680bc741fe96fc65f41"}
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.196092 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.216290 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.331070 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"]
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.387142 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"]
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.388842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.401043 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"]
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.434660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.434745 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.435259 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tz5j\" (UniqueName: \"kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.536735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tz5j\" (UniqueName: \"kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.537057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.537109 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.538110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.538136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.559094 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tz5j\" (UniqueName: \"kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j\") pod \"dnsmasq-dns-7cb5889db5-5x7tr\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") " pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.724151 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.897453 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8t22l"]
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.898690 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.901810 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.914361 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8t22l"]
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.945375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsj9q\" (UniqueName: \"kubernetes.io/projected/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-kube-api-access-hsj9q\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.945633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovn-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.945776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovs-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.945863 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.945997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-config\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:56 crc kubenswrapper[4773]: I0121 15:44:56.946074 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-combined-ca-bundle\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.047448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsj9q\" (UniqueName: \"kubernetes.io/projected/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-kube-api-access-hsj9q\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.047785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovn-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.047902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovs-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.047994 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.048129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-config\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.048218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-combined-ca-bundle\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.048142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovs-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.048142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-ovn-rundir\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.049056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-config\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.052003 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-combined-ca-bundle\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.064454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.070266 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsj9q\" (UniqueName: \"kubernetes.io/projected/3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7-kube-api-access-hsj9q\") pod \"ovn-controller-metrics-8t22l\" (UID: \"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7\") " pod="openstack/ovn-controller-metrics-8t22l"
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.217593 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-controller-metrics-8t22l" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.304692 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.330677 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.333164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.334851 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.350116 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.443581 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.462439 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.463743 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.471101 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.471223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.471257 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb5hb\" (UniqueName: \"kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.471305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.471992 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.476154 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"ovsdbserver-nb" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.498433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.510020 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.515272 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.515868 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.516212 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.518723 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-v6fk7" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.552584 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.572961 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573035 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 
15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573070 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573110 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rg64\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-kube-api-access-4rg64\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-lock\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzr9\" (UniqueName: \"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573300 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573326 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573355 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573409 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb5hb\" (UniqueName: \"kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-cache\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573483 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.573623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.574197 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.574507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.574515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.598623 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb5hb\" (UniqueName: 
\"kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb\") pod \"dnsmasq-dns-6c89d5d749-rxztr\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") " pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.649946 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.674998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rg64\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-kube-api-access-4rg64\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 
15:44:57.675164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-lock\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzr9\" (UniqueName: \"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.675275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-cache\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: E0121 15:44:57.675763 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:44:57 crc kubenswrapper[4773]: E0121 15:44:57.675896 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:44:57 crc kubenswrapper[4773]: E0121 15:44:57.676034 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed. No retries permitted until 2026-01-21 15:44:58.176010809 +0000 UTC m=+1263.100500431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.676400 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-lock\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.676462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/111512a9-4e17-4433-a7e9-e8666099d12f-cache\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.676606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.676797 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.676793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.677316 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.678478 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.678507 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32c47ed224eef9ee94017740e71262c66432d349432ba5f0b3aeccca58fe626d/globalmount\"" pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.696683 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rg64\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-kube-api-access-4rg64\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.698359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzr9\" (UniqueName: \"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9\") pod \"dnsmasq-dns-698758b865-vlcdh\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.714646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdb605d6-a9d2-433a-b78d-c70638d5ec9d\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:57 crc kubenswrapper[4773]: I0121 15:44:57.784027 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:44:58 crc kubenswrapper[4773]: I0121 15:44:58.185709 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:58 crc kubenswrapper[4773]: E0121 15:44:58.185856 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:44:58 crc kubenswrapper[4773]: E0121 15:44:58.186143 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:44:58 crc kubenswrapper[4773]: E0121 15:44:58.186205 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed. No retries permitted until 2026-01-21 15:44:59.186185193 +0000 UTC m=+1264.110674815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found Jan 21 15:44:59 crc kubenswrapper[4773]: I0121 15:44:59.097233 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-xkhx7" Jan 21 15:44:59 crc kubenswrapper[4773]: I0121 15:44:59.203510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.204522 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.204538 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.204581 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed. No retries permitted until 2026-01-21 15:45:01.204564048 +0000 UTC m=+1266.129053680 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found Jan 21 15:44:59 crc kubenswrapper[4773]: I0121 15:44:59.266785 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-9886v" Jan 21 15:44:59 crc kubenswrapper[4773]: I0121 15:44:59.346187 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-phpsd" Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.950021 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.951015 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:nd4h75h58h67ch644h67ch67fh678h647h67fh84h557h578h5d8h584h6fh9dh5bfh655h646hcch5f9h55dh5d9h6fh669h56ch679h67h55dh694h85q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58gv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfi
le:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(7656b240-dd39-4bd3-8cdb-2f5103f17656): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 15:44:59 crc kubenswrapper[4773]: E0121 15:44:59.953863 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="7656b240-dd39-4bd3-8cdb-2f5103f17656"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.144360 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"]
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.146231 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.153213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.153310 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.162899 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"]
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.230891 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.231210 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.231344 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgtcw\" (UniqueName: \"kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.283899 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="68c0e8c6-bc28-4101-a1d5-99ce639ae62c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.336915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgtcw\" (UniqueName: \"kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.344969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.345071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.346174 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.362600 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.363817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.448995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Jan 21 15:45:00 crc kubenswrapper[4773]: W0121 15:45:00.594068 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee4f520_ec7d_4b0e_8138_ef2fbcc559b7.slice/crio-6889b91fe4ecbb1e7b4e9da39e018d46ef841d8fa67a7f3cf39fa922e78e4687 WatchSource:0}: Error finding container 6889b91fe4ecbb1e7b4e9da39e018d46ef841d8fa67a7f3cf39fa922e78e4687: Status 404 returned error can't find the container with id 6889b91fe4ecbb1e7b4e9da39e018d46ef841d8fa67a7f3cf39fa922e78e4687
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.594839 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8t22l"]
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.748792 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"]
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.801817 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8t22l" event={"ID":"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7","Type":"ContainerStarted","Data":"6889b91fe4ecbb1e7b4e9da39e018d46ef841d8fa67a7f3cf39fa922e78e4687"}
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.802258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"]
Jan 21 15:45:00 crc kubenswrapper[4773]: I0121 15:45:00.828735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"]
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.195635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgtcw\" (UniqueName: \"kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw\") pod \"collect-profiles-29483505-zgc96\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:01 crc kubenswrapper[4773]: W0121 15:45:01.210900 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48bc43f_55bc_4b6a_bc8a_dac53e6549cd.slice/crio-f0fb75b378ce7bf7a9f4443448c5fb834dbf432b1fdb495aa024273dffee4b21 WatchSource:0}: Error finding container f0fb75b378ce7bf7a9f4443448c5fb834dbf432b1fdb495aa024273dffee4b21: Status 404 returned error can't find the container with id f0fb75b378ce7bf7a9f4443448c5fb834dbf432b1fdb495aa024273dffee4b21
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.274250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0"
Jan 21 15:45:01 crc kubenswrapper[4773]: E0121 15:45:01.274457 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 21 15:45:01 crc kubenswrapper[4773]: E0121 15:45:01.274472 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 21 15:45:01 crc kubenswrapper[4773]: E0121 15:45:01.274512 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed. No retries permitted until 2026-01-21 15:45:05.274498947 +0000 UTC m=+1270.198988569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.363573 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-k6w2q"]
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.365272 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.369271 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.369553 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.369620 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.375222 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-k6w2q"]
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477001 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477726 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477811 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.477979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79jj\" (UniqueName: \"kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.478050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.579647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79jj\" (UniqueName: \"kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580562 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.580586 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.582439 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.584279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.584522 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.587036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.591087 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.604000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79jj\" (UniqueName: \"kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.612299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf\") pod \"swift-ring-rebalance-k6w2q\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") " pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.696037 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.817391 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" event={"ID":"9c654304-9ce6-4243-9273-bfd23bdc0ac8","Type":"ContainerStarted","Data":"eaa76630334b7451465f0e1673abbf410718db0fc3b7a50c0bb61c0d61727917"}
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.828040 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vlcdh" event={"ID":"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd","Type":"ContainerStarted","Data":"f0fb75b378ce7bf7a9f4443448c5fb834dbf432b1fdb495aa024273dffee4b21"}
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.829455 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr" event={"ID":"ddea5ab9-bac1-479d-ae24-daf4022ce73c","Type":"ContainerStarted","Data":"9fc62fd5a01af62a970b6c4092175c78983facff7e78a8bb7518c4d48541b1aa"}
Jan 21 15:45:01 crc kubenswrapper[4773]: I0121 15:45:01.924183 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"]
Jan 21 15:45:02 crc kubenswrapper[4773]: W0121 15:45:02.010647 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527a62a3_540f_4352_903c_184f60e613a7.slice/crio-ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4 WatchSource:0}: Error finding container ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4: Status 404 returned error can't find the container with id ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4
Jan 21 15:45:02 crc kubenswrapper[4773]: I0121 15:45:02.235313 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-k6w2q"]
Jan 21 15:45:02 crc kubenswrapper[4773]: I0121 15:45:02.716032 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 21 15:45:02 crc kubenswrapper[4773]: I0121 15:45:02.836983 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-k6w2q" event={"ID":"626ecec5-3380-45fa-a2b1-248ee0af1328","Type":"ContainerStarted","Data":"fd8e7a48bf4ce79e0c7da661b9da8569880f31952aaffc7cf65c50a75c0ca382"}
Jan 21 15:45:02 crc kubenswrapper[4773]: I0121 15:45:02.838415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96" event={"ID":"527a62a3-540f-4352-903c-184f60e613a7","Type":"ContainerStarted","Data":"ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4"}
Jan 21 15:45:02 crc kubenswrapper[4773]: I0121 15:45:02.840270 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.847120 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"088efb4e-fd31-4648-88cf-ceac1edb1723","Type":"ContainerStarted","Data":"e4f9fd24a8e67bc5ae53a76d0c86b70dc6b11539d2112e35fd2bf09e1ba225cf"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.848616 4773 generic.go:334] "Generic (PLEG): container finished" podID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerID="49537f0bec4d3f0c885d9e14674033edaf91b18514cee98b5d240ca40b42c456" exitCode=0
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.848680 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" event={"ID":"9c654304-9ce6-4243-9273-bfd23bdc0ac8","Type":"ContainerDied","Data":"49537f0bec4d3f0c885d9e14674033edaf91b18514cee98b5d240ca40b42c456"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.850448 4773 generic.go:334] "Generic (PLEG): container finished" podID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerID="3314ab0a44d2eda36e92c163faa40a03e3341d9da1050bccfbe5b21921f56dc9" exitCode=0
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.850524 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vlcdh" event={"ID":"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd","Type":"ContainerDied","Data":"3314ab0a44d2eda36e92c163faa40a03e3341d9da1050bccfbe5b21921f56dc9"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.853301 4773 generic.go:334] "Generic (PLEG): container finished" podID="ddea5ab9-bac1-479d-ae24-daf4022ce73c" containerID="0e7e78e780978dc11c769c10c1dbb1667d840006a1bc6c4d516bdb2da9625968" exitCode=0
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.853355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr" event={"ID":"ddea5ab9-bac1-479d-ae24-daf4022ce73c","Type":"ContainerDied","Data":"0e7e78e780978dc11c769c10c1dbb1667d840006a1bc6c4d516bdb2da9625968"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.856304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerStarted","Data":"2699d2e9d1e6fcda87df145aa3deda82a85b1086c240c11dd4d40d418b1e0ead"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.856349 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-mldv7"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.856289 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="dnsmasq-dns" containerID="cri-o://2699d2e9d1e6fcda87df145aa3deda82a85b1086c240c11dd4d40d418b1e0ead" gracePeriod=10
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.862287 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvwkv" event={"ID":"b457bfe0-3f48-4e19-88a8-2b1ccefa549f","Type":"ContainerStarted","Data":"f5d028d753b671d211798c3ea81641c57f0e7786e87314eb68dad1472fbeca30"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.863328 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mvwkv"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.863427 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mvwkv"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.866584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerStarted","Data":"32cea80d998ff921900ecad1ba3e95a637e9903607284e452d943be681cbee64"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.869965 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="dnsmasq-dns" containerID="cri-o://32cea80d998ff921900ecad1ba3e95a637e9903607284e452d943be681cbee64" gracePeriod=10
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.871144 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.877565 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=51.624960269 podStartE2EDuration="1m34.877547558s" podCreationTimestamp="2026-01-21 15:43:29 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.807596346 +0000 UTC m=+1221.732085968" lastFinishedPulling="2026-01-21 15:45:00.060183635 +0000 UTC m=+1264.984673257" observedRunningTime="2026-01-21 15:45:03.876147089 +0000 UTC m=+1268.800636711" watchObservedRunningTime="2026-01-21 15:45:03.877547558 +0000 UTC m=+1268.802037200"
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.880181 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8f6a4485-3dde-469d-98ee-026edcc3eb76","Type":"ContainerStarted","Data":"0cfdbfaaaf756417835e67301cf3551d57ffa5b517e4d53b9363fa4f4706173b"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.882400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7656b240-dd39-4bd3-8cdb-2f5103f17656","Type":"ContainerStarted","Data":"82fc35ff410320833acc26f4050318560b22a80ce0cfaa454ce10893dc31e39f"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.884455 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96" event={"ID":"527a62a3-540f-4352-903c-184f60e613a7","Type":"ContainerStarted","Data":"349ce91b7ddd85303a5dfbaac856165d8f9cf548169584d19d626175eb2cc750"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.886266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8t22l" event={"ID":"3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7","Type":"ContainerStarted","Data":"cbdac50a3326fe419c45dc45f4680d679cae1d1f2d0f55032ecb26b702575c1b"}
Jan 21 15:45:03 crc kubenswrapper[4773]: I0121 15:45:03.902973 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" podStartSLOduration=36.558391155 podStartE2EDuration="1m44.902952724s" podCreationTimestamp="2026-01-21 15:43:19 +0000 UTC" firstStartedPulling="2026-01-21 15:43:20.389427485 +0000 UTC m=+1165.313917107" lastFinishedPulling="2026-01-21 15:44:28.733989054 +0000 UTC m=+1233.658478676" observedRunningTime="2026-01-21 15:45:03.899074547 +0000 UTC m=+1268.823564189" watchObservedRunningTime="2026-01-21 15:45:03.902952724 +0000 UTC m=+1268.827442366"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.008873 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mvwkv" podStartSLOduration=84.430753391 podStartE2EDuration="1m35.008853535s" podCreationTimestamp="2026-01-21 15:43:29 +0000 UTC" firstStartedPulling="2026-01-21 15:44:18.023223376 +0000 UTC m=+1222.947712998" lastFinishedPulling="2026-01-21 15:44:28.60132352 +0000 UTC m=+1233.525813142" observedRunningTime="2026-01-21 15:45:03.976600241 +0000 UTC m=+1268.901089883" watchObservedRunningTime="2026-01-21 15:45:04.008853535 +0000 UTC m=+1268.933343157"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.054068 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" podStartSLOduration=-9223371931.800724 podStartE2EDuration="1m45.054052002s" podCreationTimestamp="2026-01-21 15:43:19 +0000 UTC" firstStartedPulling="2026-01-21 15:43:20.655077604 +0000 UTC m=+1165.579567226" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:04.04774488 +0000 UTC m=+1268.972234512" watchObservedRunningTime="2026-01-21 15:45:04.054052002 +0000 UTC m=+1268.978541624"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.100549 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8t22l" podStartSLOduration=8.100523675 podStartE2EDuration="8.100523675s" podCreationTimestamp="2026-01-21 15:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:04.089681369 +0000 UTC m=+1269.014171021" watchObservedRunningTime="2026-01-21 15:45:04.100523675 +0000 UTC m=+1269.025013297"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.394650 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.441478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config\") pod \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") "
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.441826 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc\") pod \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") "
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.441983 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tz5j\" (UniqueName: \"kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j\") pod \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\" (UID: \"ddea5ab9-bac1-479d-ae24-daf4022ce73c\") "
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.447113 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j" (OuterVolumeSpecName: "kube-api-access-8tz5j") pod "ddea5ab9-bac1-479d-ae24-daf4022ce73c" (UID: "ddea5ab9-bac1-479d-ae24-daf4022ce73c"). InnerVolumeSpecName "kube-api-access-8tz5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.463277 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config" (OuterVolumeSpecName: "config") pod "ddea5ab9-bac1-479d-ae24-daf4022ce73c" (UID: "ddea5ab9-bac1-479d-ae24-daf4022ce73c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.463652 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddea5ab9-bac1-479d-ae24-daf4022ce73c" (UID: "ddea5ab9-bac1-479d-ae24-daf4022ce73c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.544176 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.544214 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddea5ab9-bac1-479d-ae24-daf4022ce73c-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.544229 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tz5j\" (UniqueName: \"kubernetes.io/projected/ddea5ab9-bac1-479d-ae24-daf4022ce73c-kube-api-access-8tz5j\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.898858 4773 generic.go:334] "Generic (PLEG): container finished" podID="e7edb06e-2b04-4d03-b089-4724a466d720" containerID="2699d2e9d1e6fcda87df145aa3deda82a85b1086c240c11dd4d40d418b1e0ead" exitCode=0
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.898923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerDied","Data":"2699d2e9d1e6fcda87df145aa3deda82a85b1086c240c11dd4d40d418b1e0ead"}
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.901837 4773 generic.go:334] "Generic (PLEG): container finished" podID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerID="32cea80d998ff921900ecad1ba3e95a637e9903607284e452d943be681cbee64" exitCode=0
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.901958 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerDied","Data":"32cea80d998ff921900ecad1ba3e95a637e9903607284e452d943be681cbee64"}
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.906292 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerStarted","Data":"33d58eb8fe3bcc9637436b89f13bb4d5cd715e800a5536cbd0ad491cb57a007a"}
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.918947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr" event={"ID":"ddea5ab9-bac1-479d-ae24-daf4022ce73c","Type":"ContainerDied","Data":"9fc62fd5a01af62a970b6c4092175c78983facff7e78a8bb7518c4d48541b1aa"}
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.919013 4773 scope.go:117] "RemoveContainer" containerID="0e7e78e780978dc11c769c10c1dbb1667d840006a1bc6c4d516bdb2da9625968"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.919179 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-5x7tr"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.920085 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.921099 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.922503 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.956012 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=29.736082745 podStartE2EDuration="1m38.955990868s" podCreationTimestamp="2026-01-21 15:43:26 +0000 UTC" firstStartedPulling="2026-01-21 15:43:27.857544661 +0000 UTC m=+1172.782034283" lastFinishedPulling="2026-01-21 15:44:37.077452784 +0000 UTC m=+1242.001942406" observedRunningTime="2026-01-21 15:45:04.943407233 +0000 UTC m=+1269.867896875" watchObservedRunningTime="2026-01-21 15:45:04.955990868 +0000 UTC m=+1269.880480480"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.972991 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96" podStartSLOduration=4.972970003 podStartE2EDuration="4.972970003s" podCreationTimestamp="2026-01-21 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:04.963862204 +0000 UTC m=+1269.888351846" watchObservedRunningTime="2026-01-21 15:45:04.972970003 +0000 UTC m=+1269.897459625"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.980650 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 21 15:45:04 crc kubenswrapper[4773]: I0121 15:45:04.993450 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=79.206556676 podStartE2EDuration="1m30.993434054s" podCreationTimestamp="2026-01-21 15:43:34 +0000 UTC" firstStartedPulling="2026-01-21 15:44:16.862161689 +0000 UTC m=+1221.786651311" lastFinishedPulling="2026-01-21 15:44:28.649039067 +0000 UTC m=+1233.573528689" observedRunningTime="2026-01-21 15:45:04.987037998 +0000 UTC m=+1269.911527620" watchObservedRunningTime="2026-01-21 15:45:04.993434054 +0000 UTC m=+1269.917923676"
Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.137633 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"]
Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.148116 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-5x7tr"]
Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.360932 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0"
Jan 21 15:45:05 crc kubenswrapper[4773]: E0121 15:45:05.361090 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 21 15:45:05 crc kubenswrapper[4773]: E0121 15:45:05.361191 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 21 15:45:05 crc kubenswrapper[4773]: E0121 15:45:05.361254 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed.
No retries permitted until 2026-01-21 15:45:13.361236709 +0000 UTC m=+1278.285726331 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.403316 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddea5ab9-bac1-479d-ae24-daf4022ce73c" path="/var/lib/kubelet/pods/ddea5ab9-bac1-479d-ae24-daf4022ce73c/volumes" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.500454 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.507082 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.518766 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjhhd\" (UniqueName: \"kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd\") pod \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config\") pod \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc\") pod \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\" (UID: \"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config\") pod \"e7edb06e-2b04-4d03-b089-4724a466d720\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrpmc\" (UniqueName: \"kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc\") pod \"e7edb06e-2b04-4d03-b089-4724a466d720\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.666902 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc\") pod \"e7edb06e-2b04-4d03-b089-4724a466d720\" (UID: \"e7edb06e-2b04-4d03-b089-4724a466d720\") " Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.671067 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc" (OuterVolumeSpecName: "kube-api-access-zrpmc") pod "e7edb06e-2b04-4d03-b089-4724a466d720" (UID: "e7edb06e-2b04-4d03-b089-4724a466d720"). InnerVolumeSpecName "kube-api-access-zrpmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.687016 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd" (OuterVolumeSpecName: "kube-api-access-hjhhd") pod "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" (UID: "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff"). InnerVolumeSpecName "kube-api-access-hjhhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.716847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7edb06e-2b04-4d03-b089-4724a466d720" (UID: "e7edb06e-2b04-4d03-b089-4724a466d720"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.730399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config" (OuterVolumeSpecName: "config") pod "e7edb06e-2b04-4d03-b089-4724a466d720" (UID: "e7edb06e-2b04-4d03-b089-4724a466d720"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.733771 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" (UID: "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.739270 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config" (OuterVolumeSpecName: "config") pod "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" (UID: "6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.769975 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.770019 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrpmc\" (UniqueName: \"kubernetes.io/projected/e7edb06e-2b04-4d03-b089-4724a466d720-kube-api-access-zrpmc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.770034 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7edb06e-2b04-4d03-b089-4724a466d720-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.770046 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjhhd\" (UniqueName: \"kubernetes.io/projected/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-kube-api-access-hjhhd\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc 
kubenswrapper[4773]: I0121 15:45:05.770058 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.770070 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.929494 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.930601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4hbk9" event={"ID":"6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff","Type":"ContainerDied","Data":"1ba7156543f7e270f97806dd19cf4e66916aaad3364c5ff0461185b2264aae3e"} Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.930657 4773 scope.go:117] "RemoveContainer" containerID="32cea80d998ff921900ecad1ba3e95a637e9903607284e452d943be681cbee64" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.936200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" event={"ID":"9c654304-9ce6-4243-9273-bfd23bdc0ac8","Type":"ContainerStarted","Data":"d05ddf62e28b064d42e20a062db48cea031aa8ff37e860eba2040d3e04562539"} Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.938561 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.943648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vlcdh" event={"ID":"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd","Type":"ContainerStarted","Data":"a57b464d3678b7bf523449d37eae262c473608887ea71321500fa2c91e35769e"} Jan 21 15:45:05 crc 
kubenswrapper[4773]: I0121 15:45:05.944239 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.959303 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" podStartSLOduration=8.95928706 podStartE2EDuration="8.95928706s" podCreationTimestamp="2026-01-21 15:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:05.957955874 +0000 UTC m=+1270.882445496" watchObservedRunningTime="2026-01-21 15:45:05.95928706 +0000 UTC m=+1270.883776682" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.961971 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" event={"ID":"e7edb06e-2b04-4d03-b089-4724a466d720","Type":"ContainerDied","Data":"8baee2408b6ee9636d6079a2a62c5605cd814aa8dc77d8e0a030ca1dc80c18a0"} Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.962127 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mldv7" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.970529 4773 scope.go:117] "RemoveContainer" containerID="82f759fcde8fbf130ede62373f729e90f94668088a233680bc741fe96fc65f41" Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.973892 4773 generic.go:334] "Generic (PLEG): container finished" podID="527a62a3-540f-4352-903c-184f60e613a7" containerID="349ce91b7ddd85303a5dfbaac856165d8f9cf548169584d19d626175eb2cc750" exitCode=0 Jan 21 15:45:05 crc kubenswrapper[4773]: I0121 15:45:05.974527 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96" event={"ID":"527a62a3-540f-4352-903c-184f60e613a7","Type":"ContainerDied","Data":"349ce91b7ddd85303a5dfbaac856165d8f9cf548169584d19d626175eb2cc750"} Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.010077 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"] Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.011045 4773 scope.go:117] "RemoveContainer" containerID="2699d2e9d1e6fcda87df145aa3deda82a85b1086c240c11dd4d40d418b1e0ead" Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.022972 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4hbk9"] Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.044894 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-vlcdh" podStartSLOduration=9.044876874 podStartE2EDuration="9.044876874s" podCreationTimestamp="2026-01-21 15:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:06.009420834 +0000 UTC m=+1270.933910466" watchObservedRunningTime="2026-01-21 15:45:06.044876874 +0000 UTC m=+1270.969366496" Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.062851 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"] Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.069307 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mldv7"] Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.072599 4773 scope.go:117] "RemoveContainer" containerID="7b4c0d39628d4adc9c6e17e0b7ec02c9bbcd1c5f07d5a898a70a8b53b9a96d66" Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.500156 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 21 15:45:06 crc kubenswrapper[4773]: I0121 15:45:06.544419 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.044975 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.201306 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 21 15:45:07 crc kubenswrapper[4773]: E0121 15:45:07.202501 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddea5ab9-bac1-479d-ae24-daf4022ce73c" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.202530 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddea5ab9-bac1-479d-ae24-daf4022ce73c" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: E0121 15:45:07.202555 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.202563 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: E0121 15:45:07.202576 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.202583 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: E0121 15:45:07.202594 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.202602 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: E0121 15:45:07.202618 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.202625 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.225577 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.225656 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" containerName="dnsmasq-dns" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.225675 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddea5ab9-bac1-479d-ae24-daf4022ce73c" containerName="init" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.227017 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.231986 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.241557 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.241887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.242053 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6zg55" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.260540 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319207 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-scripts\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 
15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-config\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319547 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn46h\" (UniqueName: \"kubernetes.io/projected/fd966624-79ad-4926-9253-741b8f1e6fe4-kube-api-access-mn46h\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.319631 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.401006 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff" path="/var/lib/kubelet/pods/6dbe4227-f12f-4df2-bd66-dbd87ea3a2ff/volumes" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.401747 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7edb06e-2b04-4d03-b089-4724a466d720" 
path="/var/lib/kubelet/pods/e7edb06e-2b04-4d03-b089-4724a466d720/volumes" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn46h\" (UniqueName: \"kubernetes.io/projected/fd966624-79ad-4926-9253-741b8f1e6fe4-kube-api-access-mn46h\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421674 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-scripts\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.422316 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.421834 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-config\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.423977 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-scripts\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.424232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd966624-79ad-4926-9253-741b8f1e6fe4-config\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.427578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 
15:45:07.428500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.440007 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd966624-79ad-4926-9253-741b8f1e6fe4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.443105 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn46h\" (UniqueName: \"kubernetes.io/projected/fd966624-79ad-4926-9253-741b8f1e6fe4-kube-api-access-mn46h\") pod \"ovn-northd-0\" (UID: \"fd966624-79ad-4926-9253-741b8f1e6fe4\") " pod="openstack/ovn-northd-0" Jan 21 15:45:07 crc kubenswrapper[4773]: I0121 15:45:07.572068 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.949737 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.973796 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume\") pod \"527a62a3-540f-4352-903c-184f60e613a7\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") "
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.973870 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume\") pod \"527a62a3-540f-4352-903c-184f60e613a7\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") "
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.974109 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgtcw\" (UniqueName: \"kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw\") pod \"527a62a3-540f-4352-903c-184f60e613a7\" (UID: \"527a62a3-540f-4352-903c-184f60e613a7\") "
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.975390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "527a62a3-540f-4352-903c-184f60e613a7" (UID: "527a62a3-540f-4352-903c-184f60e613a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.980934 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "527a62a3-540f-4352-903c-184f60e613a7" (UID: "527a62a3-540f-4352-903c-184f60e613a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:45:09 crc kubenswrapper[4773]: I0121 15:45:09.981172 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw" (OuterVolumeSpecName: "kube-api-access-fgtcw") pod "527a62a3-540f-4352-903c-184f60e613a7" (UID: "527a62a3-540f-4352-903c-184f60e613a7"). InnerVolumeSpecName "kube-api-access-fgtcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.034845 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96" event={"ID":"527a62a3-540f-4352-903c-184f60e613a7","Type":"ContainerDied","Data":"ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4"}
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.034926 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecf8f149d00b6f5337a675b63c3cd9b2ec0db7361aac8edb4f40922479d08cd4"
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.034971 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.074897 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=<
Jan 21 15:45:10 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 21 15:45:10 crc kubenswrapper[4773]: >
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.077424 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgtcw\" (UniqueName: \"kubernetes.io/projected/527a62a3-540f-4352-903c-184f60e613a7-kube-api-access-fgtcw\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.077454 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/527a62a3-540f-4352-903c-184f60e613a7-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.077463 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/527a62a3-540f-4352-903c-184f60e613a7-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.277615 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="68c0e8c6-bc28-4101-a1d5-99ce639ae62c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Jan 21 15:45:10 crc kubenswrapper[4773]: I0121 15:45:10.316612 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.051443 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-k6w2q" event={"ID":"626ecec5-3380-45fa-a2b1-248ee0af1328","Type":"ContainerStarted","Data":"70e2879d6173652e2ef42c7fde385758ef6c3dac4ac64258906a0dd1e8e91a16"}
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.052960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fd966624-79ad-4926-9253-741b8f1e6fe4","Type":"ContainerStarted","Data":"d1f9fd4ef40d899d68ab5a41f86a7fb57d7bc9cd8a2d1423118951cb62f36081"}
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.054970 4773 generic.go:334] "Generic (PLEG): container finished" podID="32888aa3-cb52-484f-9745-5d5dfc5179df" containerID="80b2953f89db8a0aaea03e3ec256b914b62a674cfe4c0a7ffc14f128e4fc91e7" exitCode=0
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.055007 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32888aa3-cb52-484f-9745-5d5dfc5179df","Type":"ContainerDied","Data":"80b2953f89db8a0aaea03e3ec256b914b62a674cfe4c0a7ffc14f128e4fc91e7"}
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.082805 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-k6w2q" podStartSLOduration=2.19531451 podStartE2EDuration="11.082782102s" podCreationTimestamp="2026-01-21 15:45:01 +0000 UTC" firstStartedPulling="2026-01-21 15:45:02.71557986 +0000 UTC m=+1267.640069482" lastFinishedPulling="2026-01-21 15:45:11.603047452 +0000 UTC m=+1276.527537074" observedRunningTime="2026-01-21 15:45:12.070849215 +0000 UTC m=+1276.995338867" watchObservedRunningTime="2026-01-21 15:45:12.082782102 +0000 UTC m=+1277.007271724"
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.651931 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr"
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.786899 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-vlcdh"
Jan 21 15:45:12 crc kubenswrapper[4773]: I0121 15:45:12.859152 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"]
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.068525 4773 generic.go:334] "Generic (PLEG): container finished" podID="869ad9c0-3593-4ebc-9b58-7b9615e46927" containerID="87e6e51434ba11d721cb71c15b5ff6804207bda93141d4700ab8702b602bb459" exitCode=0
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.068631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"869ad9c0-3593-4ebc-9b58-7b9615e46927","Type":"ContainerDied","Data":"87e6e51434ba11d721cb71c15b5ff6804207bda93141d4700ab8702b602bb459"}
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.075025 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32888aa3-cb52-484f-9745-5d5dfc5179df","Type":"ContainerStarted","Data":"7f9d93e1b0ce7900e2b51e976d7f174f95e730a8230eee276878803b1e0552f4"}
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.075260 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="dnsmasq-dns" containerID="cri-o://d05ddf62e28b064d42e20a062db48cea031aa8ff37e860eba2040d3e04562539" gracePeriod=10
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.119231 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=46.526789767 podStartE2EDuration="1m52.119209022s" podCreationTimestamp="2026-01-21 15:43:21 +0000 UTC" firstStartedPulling="2026-01-21 15:43:23.134582187 +0000 UTC m=+1168.059071809" lastFinishedPulling="2026-01-21 15:44:28.727001432 +0000 UTC m=+1233.651491064" observedRunningTime="2026-01-21 15:45:13.10819928 +0000 UTC m=+1278.032688912" watchObservedRunningTime="2026-01-21 15:45:13.119209022 +0000 UTC m=+1278.043698644"
Jan 21 15:45:13 crc kubenswrapper[4773]: I0121 15:45:13.448350 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0"
Jan 21 15:45:13 crc kubenswrapper[4773]: E0121 15:45:13.448565 4773 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 21 15:45:13 crc kubenswrapper[4773]: E0121 15:45:13.448580 4773 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 21 15:45:13 crc kubenswrapper[4773]: E0121 15:45:13.448624 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift podName:111512a9-4e17-4433-a7e9-e8666099d12f nodeName:}" failed. No retries permitted until 2026-01-21 15:45:29.448611634 +0000 UTC m=+1294.373101246 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift") pod "swift-storage-0" (UID: "111512a9-4e17-4433-a7e9-e8666099d12f") : configmap "swift-ring-files" not found
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.104161 4773 generic.go:334] "Generic (PLEG): container finished" podID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerID="d05ddf62e28b064d42e20a062db48cea031aa8ff37e860eba2040d3e04562539" exitCode=0
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.104468 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" event={"ID":"9c654304-9ce6-4243-9273-bfd23bdc0ac8","Type":"ContainerDied","Data":"d05ddf62e28b064d42e20a062db48cea031aa8ff37e860eba2040d3e04562539"}
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.815350 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr"
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.881405 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config\") pod \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") "
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.881491 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb\") pod \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") "
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.881605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb5hb\" (UniqueName: \"kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb\") pod \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") "
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.881671 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc\") pod \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\" (UID: \"9c654304-9ce6-4243-9273-bfd23bdc0ac8\") "
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.890304 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb" (OuterVolumeSpecName: "kube-api-access-vb5hb") pod "9c654304-9ce6-4243-9273-bfd23bdc0ac8" (UID: "9c654304-9ce6-4243-9273-bfd23bdc0ac8"). InnerVolumeSpecName "kube-api-access-vb5hb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.952487 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config" (OuterVolumeSpecName: "config") pod "9c654304-9ce6-4243-9273-bfd23bdc0ac8" (UID: "9c654304-9ce6-4243-9273-bfd23bdc0ac8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.953236 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c654304-9ce6-4243-9273-bfd23bdc0ac8" (UID: "9c654304-9ce6-4243-9273-bfd23bdc0ac8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.967623 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c654304-9ce6-4243-9273-bfd23bdc0ac8" (UID: "9c654304-9ce6-4243-9273-bfd23bdc0ac8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.984311 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb5hb\" (UniqueName: \"kubernetes.io/projected/9c654304-9ce6-4243-9273-bfd23bdc0ac8-kube-api-access-vb5hb\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.984357 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.984369 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:14 crc kubenswrapper[4773]: I0121 15:45:14.984383 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c654304-9ce6-4243-9273-bfd23bdc0ac8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.059274 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=<
Jan 21 15:45:15 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 21 15:45:15 crc kubenswrapper[4773]: >
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.267791 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mvwkv"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.268158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"869ad9c0-3593-4ebc-9b58-7b9615e46927","Type":"ContainerStarted","Data":"b3659126612817e6b5f47b143bbb2aa5a39ee96439e3d9563372dab1e076d2e0"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.270818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fd966624-79ad-4926-9253-741b8f1e6fe4","Type":"ContainerStarted","Data":"99929b172769a3f759fa9f4c08a1d651855809c545ad8b04ee2449bab0af07e2"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.270952 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fd966624-79ad-4926-9253-741b8f1e6fe4","Type":"ContainerStarted","Data":"f5397ccf853d507e6b9bdff48acefc586e8a0a55a85b8c49f6cc37e8f94935d8"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.271206 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.272598 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerID="a2d82cd99a3e89e61c6361ec20503a0529ff9b77bb61e768caaf32d1b0602c8e" exitCode=0
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.272683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerDied","Data":"a2d82cd99a3e89e61c6361ec20503a0529ff9b77bb61e768caaf32d1b0602c8e"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.276028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerStarted","Data":"357714fca181450f1d0365922b37c9854a4d7b09ac685db4ff8ad351120ec2db"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.281115 4773 generic.go:334] "Generic (PLEG): container finished" podID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerID="67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469" exitCode=0
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.281200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerDied","Data":"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.286675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" event={"ID":"9c654304-9ce6-4243-9273-bfd23bdc0ac8","Type":"ContainerDied","Data":"eaa76630334b7451465f0e1673abbf410718db0fc3b7a50c0bb61c0d61727917"}
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.287034 4773 scope.go:117] "RemoveContainer" containerID="d05ddf62e28b064d42e20a062db48cea031aa8ff37e860eba2040d3e04562539"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.287216 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.326545 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=48.988697945 podStartE2EDuration="1m53.326527203s" podCreationTimestamp="2026-01-21 15:43:22 +0000 UTC" firstStartedPulling="2026-01-21 15:43:25.048029932 +0000 UTC m=+1169.972519554" lastFinishedPulling="2026-01-21 15:44:29.38585919 +0000 UTC m=+1234.310348812" observedRunningTime="2026-01-21 15:45:15.315912972 +0000 UTC m=+1280.240402594" watchObservedRunningTime="2026-01-21 15:45:15.326527203 +0000 UTC m=+1280.251016825"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.348583 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.393297708 podStartE2EDuration="8.348557917s" podCreationTimestamp="2026-01-21 15:45:07 +0000 UTC" firstStartedPulling="2026-01-21 15:45:11.588798301 +0000 UTC m=+1276.513287923" lastFinishedPulling="2026-01-21 15:45:14.54405851 +0000 UTC m=+1279.468548132" observedRunningTime="2026-01-21 15:45:15.333886904 +0000 UTC m=+1280.258376536" watchObservedRunningTime="2026-01-21 15:45:15.348557917 +0000 UTC m=+1280.273047539"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.427470 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=33.593987816 podStartE2EDuration="1m49.427450367s" podCreationTimestamp="2026-01-21 15:43:26 +0000 UTC" firstStartedPulling="2026-01-21 15:43:58.711524305 +0000 UTC m=+1203.636013927" lastFinishedPulling="2026-01-21 15:45:14.544986856 +0000 UTC m=+1279.469476478" observedRunningTime="2026-01-21 15:45:15.422471382 +0000 UTC m=+1280.346961024" watchObservedRunningTime="2026-01-21 15:45:15.427450367 +0000 UTC m=+1280.351939989"
Jan 21 15:45:15 crc kubenswrapper[4773]: I0121 15:45:15.498982 4773 scope.go:117] "RemoveContainer" containerID="49537f0bec4d3f0c885d9e14674033edaf91b18514cee98b5d240ca40b42c456"
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.311011 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerStarted","Data":"759d30338b996eef460469dd5376f15317f00b658a5853172f67b625104e5cee"}
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.311238 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.313489 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerStarted","Data":"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"}
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.313746 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.338275 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.412657412 podStartE2EDuration="1m57.338256446s" podCreationTimestamp="2026-01-21 15:43:19 +0000 UTC" firstStartedPulling="2026-01-21 15:43:21.461386917 +0000 UTC m=+1166.385876539" lastFinishedPulling="2026-01-21 15:44:29.386985951 +0000 UTC m=+1234.311475573" observedRunningTime="2026-01-21 15:45:16.331656875 +0000 UTC m=+1281.256146497" watchObservedRunningTime="2026-01-21 15:45:16.338256446 +0000 UTC m=+1281.262746068"
Jan 21 15:45:16 crc kubenswrapper[4773]: I0121 15:45:16.361910 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.740065711 podStartE2EDuration="1m57.361894354s" podCreationTimestamp="2026-01-21 15:43:19 +0000 UTC" firstStartedPulling="2026-01-21 15:43:22.111271017 +0000 UTC m=+1167.035760649" lastFinishedPulling="2026-01-21 15:44:28.73309967 +0000 UTC m=+1233.657589292" observedRunningTime="2026-01-21 15:45:16.357447071 +0000 UTC m=+1281.281936703" watchObservedRunningTime="2026-01-21 15:45:16.361894354 +0000 UTC m=+1281.286383976"
Jan 21 15:45:17 crc kubenswrapper[4773]: I0121 15:45:17.602926 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:20 crc kubenswrapper[4773]: I0121 15:45:20.070744 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=<
Jan 21 15:45:20 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 21 15:45:20 crc kubenswrapper[4773]: >
Jan 21 15:45:20 crc kubenswrapper[4773]: I0121 15:45:20.282831 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="68c0e8c6-bc28-4101-a1d5-99ce639ae62c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Jan 21 15:45:22 crc kubenswrapper[4773]: I0121 15:45:22.361209 4773 generic.go:334] "Generic (PLEG): container finished" podID="626ecec5-3380-45fa-a2b1-248ee0af1328" containerID="70e2879d6173652e2ef42c7fde385758ef6c3dac4ac64258906a0dd1e8e91a16" exitCode=0
Jan 21 15:45:22 crc kubenswrapper[4773]: I0121 15:45:22.361319 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-k6w2q" event={"ID":"626ecec5-3380-45fa-a2b1-248ee0af1328","Type":"ContainerDied","Data":"70e2879d6173652e2ef42c7fde385758ef6c3dac4ac64258906a0dd1e8e91a16"}
Jan 21 15:45:22 crc kubenswrapper[4773]: I0121 15:45:22.611541 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 21 15:45:22 crc kubenswrapper[4773]: I0121 15:45:22.611841 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 21 15:45:22 crc kubenswrapper[4773]: I0121 15:45:22.683608 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.455568 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751250 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1880-account-create-update-jh658"]
Jan 21 15:45:23 crc kubenswrapper[4773]: E0121 15:45:23.751660 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="dnsmasq-dns"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751683 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="dnsmasq-dns"
Jan 21 15:45:23 crc kubenswrapper[4773]: E0121 15:45:23.751721 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527a62a3-540f-4352-903c-184f60e613a7" containerName="collect-profiles"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751731 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="527a62a3-540f-4352-903c-184f60e613a7" containerName="collect-profiles"
Jan 21 15:45:23 crc kubenswrapper[4773]: E0121 15:45:23.751753 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="init"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751761 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="init"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751962 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="527a62a3-540f-4352-903c-184f60e613a7" containerName="collect-profiles"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.751977 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" containerName="dnsmasq-dns"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.752731 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.777735 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.781569 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1880-account-create-update-jh658"]
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.834270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.834347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6ndw\" (UniqueName: \"kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.838181 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hsgk4"]
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.840773 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.854789 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hsgk4"]
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.912219 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-k6w2q"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.937943 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.938015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6ndw\" (UniqueName: \"kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.938142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.938309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9fh\" (UniqueName: \"kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.938990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:23 crc kubenswrapper[4773]: I0121 15:45:23.958434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6ndw\" (UniqueName: \"kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw\") pod \"keystone-1880-account-create-update-jh658\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x79jj\" (UniqueName: \"kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039231 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039260 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039350 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039473 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039545 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices\") pod \"626ecec5-3380-45fa-a2b1-248ee0af1328\" (UID: \"626ecec5-3380-45fa-a2b1-248ee0af1328\") "
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.039951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9fh\" (UniqueName: \"kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.040462 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.040843 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.042162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.065987 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.066754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-49fq5"]
Jan 21 15:45:24 crc kubenswrapper[4773]: E0121 15:45:24.067295 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626ecec5-3380-45fa-a2b1-248ee0af1328" containerName="swift-ring-rebalance"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.067321 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="626ecec5-3380-45fa-a2b1-248ee0af1328" containerName="swift-ring-rebalance"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.067560 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="626ecec5-3380-45fa-a2b1-248ee0af1328" containerName="swift-ring-rebalance"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.068407 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49fq5"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.073129 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj" (OuterVolumeSpecName: "kube-api-access-x79jj") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "kube-api-access-x79jj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.083473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9fh\" (UniqueName: \"kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh\") pod \"keystone-db-create-hsgk4\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " pod="openstack/keystone-db-create-hsgk4"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.087781 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1880-account-create-update-jh658"
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.092604 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-49fq5"]
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.100222 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.120560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.122952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts" (OuterVolumeSpecName: "scripts") pod "626ecec5-3380-45fa-a2b1-248ee0af1328" (UID: "626ecec5-3380-45fa-a2b1-248ee0af1328"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.146137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw\") pod \"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.146334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts\") pod \"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148391 4773 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148429 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148580 4773 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/626ecec5-3380-45fa-a2b1-248ee0af1328-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148597 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148646 4773 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/626ecec5-3380-45fa-a2b1-248ee0af1328-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148664 4773 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/626ecec5-3380-45fa-a2b1-248ee0af1328-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.148746 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x79jj\" (UniqueName: \"kubernetes.io/projected/626ecec5-3380-45fa-a2b1-248ee0af1328-kube-api-access-x79jj\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.169269 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-836e-account-create-update-vr956"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.171552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.175028 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.182558 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-836e-account-create-update-vr956"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.206285 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.210097 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.227933 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsgk4" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.254315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw\") pod \"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.254468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts\") pod \"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.254603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gtrs\" (UniqueName: \"kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.254638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.256505 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts\") pod 
\"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.286511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw\") pod \"placement-db-create-49fq5\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.356852 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dp4b6"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.357047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gtrs\" (UniqueName: \"kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.357110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.358156 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.358407 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.376675 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dp4b6"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.378396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gtrs\" (UniqueName: \"kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs\") pod \"placement-836e-account-create-update-vr956\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.387112 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-k6w2q" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.387349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-k6w2q" event={"ID":"626ecec5-3380-45fa-a2b1-248ee0af1328","Type":"ContainerDied","Data":"fd8e7a48bf4ce79e0c7da661b9da8569880f31952aaffc7cf65c50a75c0ca382"} Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.387407 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd8e7a48bf4ce79e0c7da661b9da8569880f31952aaffc7cf65c50a75c0ca382" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.390200 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.458419 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r246r\" (UniqueName: \"kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 
21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.458870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.483715 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bba4-account-create-update-d28hs"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.485162 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.488182 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.498789 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bba4-account-create-update-d28hs"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.551139 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-49fq5" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.563768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r246r\" (UniqueName: \"kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.563902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.564911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.576507 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.587024 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r246r\" (UniqueName: \"kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r\") pod \"glance-db-create-dp4b6\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.612803 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1880-account-create-update-jh658"] Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.667459 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.667761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w66f\" (UniqueName: \"kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.679398 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.770229 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w66f\" (UniqueName: \"kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.770786 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.771986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.794223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w66f\" (UniqueName: \"kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f\") pod \"glance-bba4-account-create-update-d28hs\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.800415 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:24 crc kubenswrapper[4773]: I0121 15:45:24.822220 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hsgk4"] Jan 21 15:45:24 crc kubenswrapper[4773]: W0121 15:45:24.857962 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2851f3fb_2572_4f19_be86_4771b3b33b06.slice/crio-3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f WatchSource:0}: Error finding container 3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f: Status 404 returned error can't find the container with id 3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.067400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-49fq5"] Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.160773 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=< Jan 21 15:45:25 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 15:45:25 crc kubenswrapper[4773]: > Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.196454 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dp4b6"] Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.264306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-836e-account-create-update-vr956"] Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.421792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dp4b6" 
event={"ID":"dd694032-4f44-44b3-b920-83c2cae0bcb5","Type":"ContainerStarted","Data":"21e4abe4432a6e8826dfb1d664f98503ac69872c821ea3482c0f76dd3198dfba"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.422568 4773 generic.go:334] "Generic (PLEG): container finished" podID="36f06063-64e0-4270-9b6e-258104f23d0a" containerID="1a0940fd4d2bdac5744603b18f5b76869251a3f3de035474d8f8c25a587b52ac" exitCode=0 Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.422661 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1880-account-create-update-jh658" event={"ID":"36f06063-64e0-4270-9b6e-258104f23d0a","Type":"ContainerDied","Data":"1a0940fd4d2bdac5744603b18f5b76869251a3f3de035474d8f8c25a587b52ac"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.422726 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1880-account-create-update-jh658" event={"ID":"36f06063-64e0-4270-9b6e-258104f23d0a","Type":"ContainerStarted","Data":"22bf583b000d99996ffb187a1207f0163f2b8c76cae04be4f48882850fb55e3b"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.424397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-836e-account-create-update-vr956" event={"ID":"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe","Type":"ContainerStarted","Data":"1254d9fc74bdacd6480b5a21e45782c5623cdfe7a6f44b6c8e1666b799730be8"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.426289 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49fq5" event={"ID":"5feba328-ee96-4d3d-8654-c70374332b17","Type":"ContainerStarted","Data":"85434c8e08817287b2779eb6b6f7b88855f3e8c8178232d7dc65641ba461e6f4"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.426321 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49fq5" 
event={"ID":"5feba328-ee96-4d3d-8654-c70374332b17","Type":"ContainerStarted","Data":"d6bb660b685c62036df418abb7cda83165f479bea363e32f9d66afce2f2b8c99"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.435069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsgk4" event={"ID":"2851f3fb-2572-4f19-be86-4771b3b33b06","Type":"ContainerStarted","Data":"2b6aff19116882f7e97b35a58aae8ec18f287a2317918a696aa0c51a819cca1b"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.435110 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsgk4" event={"ID":"2851f3fb-2572-4f19-be86-4771b3b33b06","Type":"ContainerStarted","Data":"3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f"} Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.514323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-49fq5" podStartSLOduration=1.5143015979999999 podStartE2EDuration="1.514301598s" podCreationTimestamp="2026-01-21 15:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:25.507544133 +0000 UTC m=+1290.432033755" watchObservedRunningTime="2026-01-21 15:45:25.514301598 +0000 UTC m=+1290.438791220" Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.535627 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-hsgk4" podStartSLOduration=2.535601782 podStartE2EDuration="2.535601782s" podCreationTimestamp="2026-01-21 15:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:25.525582647 +0000 UTC m=+1290.450072259" watchObservedRunningTime="2026-01-21 15:45:25.535601782 +0000 UTC m=+1290.460091394" Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.588833 4773 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-bba4-account-create-update-d28hs"] Jan 21 15:45:25 crc kubenswrapper[4773]: I0121 15:45:25.608560 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.444203 4773 generic.go:334] "Generic (PLEG): container finished" podID="5feba328-ee96-4d3d-8654-c70374332b17" containerID="85434c8e08817287b2779eb6b6f7b88855f3e8c8178232d7dc65641ba461e6f4" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.444256 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49fq5" event={"ID":"5feba328-ee96-4d3d-8654-c70374332b17","Type":"ContainerDied","Data":"85434c8e08817287b2779eb6b6f7b88855f3e8c8178232d7dc65641ba461e6f4"} Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.445968 4773 generic.go:334] "Generic (PLEG): container finished" podID="2851f3fb-2572-4f19-be86-4771b3b33b06" containerID="2b6aff19116882f7e97b35a58aae8ec18f287a2317918a696aa0c51a819cca1b" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.446065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsgk4" event={"ID":"2851f3fb-2572-4f19-be86-4771b3b33b06","Type":"ContainerDied","Data":"2b6aff19116882f7e97b35a58aae8ec18f287a2317918a696aa0c51a819cca1b"} Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.447799 4773 generic.go:334] "Generic (PLEG): container finished" podID="b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" containerID="e6a5c3f3bbc2a1b7ba8ef36d8cbc9676d13cedcfd2f5f436ecd0770f2dfd6731" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.447850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bba4-account-create-update-d28hs" event={"ID":"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3","Type":"ContainerDied","Data":"e6a5c3f3bbc2a1b7ba8ef36d8cbc9676d13cedcfd2f5f436ecd0770f2dfd6731"} Jan 21 15:45:26 crc 
kubenswrapper[4773]: I0121 15:45:26.447885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bba4-account-create-update-d28hs" event={"ID":"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3","Type":"ContainerStarted","Data":"cafc4801c238c21f3b7368808f3f24696412c7b74c20854d206066dffeee28c5"} Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.449258 4773 generic.go:334] "Generic (PLEG): container finished" podID="dd694032-4f44-44b3-b920-83c2cae0bcb5" containerID="d3bd8172c31f65cea2e49b259dc8168c9851e33b7b7a1c19002f635a9e1f7d4d" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.449322 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dp4b6" event={"ID":"dd694032-4f44-44b3-b920-83c2cae0bcb5","Type":"ContainerDied","Data":"d3bd8172c31f65cea2e49b259dc8168c9851e33b7b7a1c19002f635a9e1f7d4d"} Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.450946 4773 generic.go:334] "Generic (PLEG): container finished" podID="f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" containerID="cbc2707f7fdb0d7f636dc71af38069483b71902b168c3d3bd004923dd2d0ba8c" exitCode=0 Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.451058 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-836e-account-create-update-vr956" event={"ID":"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe","Type":"ContainerDied","Data":"cbc2707f7fdb0d7f636dc71af38069483b71902b168c3d3bd004923dd2d0ba8c"} Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.866235 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1880-account-create-update-jh658" Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.967836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6ndw\" (UniqueName: \"kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw\") pod \"36f06063-64e0-4270-9b6e-258104f23d0a\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.968099 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts\") pod \"36f06063-64e0-4270-9b6e-258104f23d0a\" (UID: \"36f06063-64e0-4270-9b6e-258104f23d0a\") " Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.968744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36f06063-64e0-4270-9b6e-258104f23d0a" (UID: "36f06063-64e0-4270-9b6e-258104f23d0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:26 crc kubenswrapper[4773]: I0121 15:45:26.975066 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw" (OuterVolumeSpecName: "kube-api-access-n6ndw") pod "36f06063-64e0-4270-9b6e-258104f23d0a" (UID: "36f06063-64e0-4270-9b6e-258104f23d0a"). InnerVolumeSpecName "kube-api-access-n6ndw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.070435 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36f06063-64e0-4270-9b6e-258104f23d0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.070483 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6ndw\" (UniqueName: \"kubernetes.io/projected/36f06063-64e0-4270-9b6e-258104f23d0a-kube-api-access-n6ndw\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.461915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1880-account-create-update-jh658" event={"ID":"36f06063-64e0-4270-9b6e-258104f23d0a","Type":"ContainerDied","Data":"22bf583b000d99996ffb187a1207f0163f2b8c76cae04be4f48882850fb55e3b"} Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.462250 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22bf583b000d99996ffb187a1207f0163f2b8c76cae04be4f48882850fb55e3b" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.462175 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1880-account-create-update-jh658" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.600049 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.603276 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.684752 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 21 15:45:27 crc kubenswrapper[4773]: I0121 15:45:27.907351 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.106649 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w66f\" (UniqueName: \"kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f\") pod \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.106810 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts\") pod \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\" (UID: \"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.107599 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" (UID: "b1c38ae4-a741-4aaf-9f63-9f2384ad30a3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.118128 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f" (OuterVolumeSpecName: "kube-api-access-5w66f") pod "b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" (UID: "b1c38ae4-a741-4aaf-9f63-9f2384ad30a3"). InnerVolumeSpecName "kube-api-access-5w66f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.209092 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w66f\" (UniqueName: \"kubernetes.io/projected/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-kube-api-access-5w66f\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.209501 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.268013 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.273882 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.278998 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49fq5" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.287653 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsgk4" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412414 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts\") pod \"2851f3fb-2572-4f19-be86-4771b3b33b06\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412464 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts\") pod \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r246r\" (UniqueName: \"kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r\") pod \"dd694032-4f44-44b3-b920-83c2cae0bcb5\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412561 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz9fh\" (UniqueName: \"kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh\") pod \"2851f3fb-2572-4f19-be86-4771b3b33b06\" (UID: \"2851f3fb-2572-4f19-be86-4771b3b33b06\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412599 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw\") pod \"5feba328-ee96-4d3d-8654-c70374332b17\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412641 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4gtrs\" (UniqueName: \"kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs\") pod \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\" (UID: \"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts\") pod \"dd694032-4f44-44b3-b920-83c2cae0bcb5\" (UID: \"dd694032-4f44-44b3-b920-83c2cae0bcb5\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.412785 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts\") pod \"5feba328-ee96-4d3d-8654-c70374332b17\" (UID: \"5feba328-ee96-4d3d-8654-c70374332b17\") " Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.413666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5feba328-ee96-4d3d-8654-c70374332b17" (UID: "5feba328-ee96-4d3d-8654-c70374332b17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.414046 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2851f3fb-2572-4f19-be86-4771b3b33b06" (UID: "2851f3fb-2572-4f19-be86-4771b3b33b06"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.414368 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" (UID: "f07cf5f7-a5cc-49e5-a90f-f49a75c395fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.415181 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd694032-4f44-44b3-b920-83c2cae0bcb5" (UID: "dd694032-4f44-44b3-b920-83c2cae0bcb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.418132 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r" (OuterVolumeSpecName: "kube-api-access-r246r") pod "dd694032-4f44-44b3-b920-83c2cae0bcb5" (UID: "dd694032-4f44-44b3-b920-83c2cae0bcb5"). InnerVolumeSpecName "kube-api-access-r246r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.418214 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw" (OuterVolumeSpecName: "kube-api-access-pdjbw") pod "5feba328-ee96-4d3d-8654-c70374332b17" (UID: "5feba328-ee96-4d3d-8654-c70374332b17"). InnerVolumeSpecName "kube-api-access-pdjbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.420018 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs" (OuterVolumeSpecName: "kube-api-access-4gtrs") pod "f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" (UID: "f07cf5f7-a5cc-49e5-a90f-f49a75c395fe"). InnerVolumeSpecName "kube-api-access-4gtrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.421676 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh" (OuterVolumeSpecName: "kube-api-access-rz9fh") pod "2851f3fb-2572-4f19-be86-4771b3b33b06" (UID: "2851f3fb-2572-4f19-be86-4771b3b33b06"). InnerVolumeSpecName "kube-api-access-rz9fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.473206 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-836e-account-create-update-vr956" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.473221 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-836e-account-create-update-vr956" event={"ID":"f07cf5f7-a5cc-49e5-a90f-f49a75c395fe","Type":"ContainerDied","Data":"1254d9fc74bdacd6480b5a21e45782c5623cdfe7a6f44b6c8e1666b799730be8"} Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.473357 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1254d9fc74bdacd6480b5a21e45782c5623cdfe7a6f44b6c8e1666b799730be8" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.475748 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-49fq5" event={"ID":"5feba328-ee96-4d3d-8654-c70374332b17","Type":"ContainerDied","Data":"d6bb660b685c62036df418abb7cda83165f479bea363e32f9d66afce2f2b8c99"} Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.475809 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6bb660b685c62036df418abb7cda83165f479bea363e32f9d66afce2f2b8c99" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.476223 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-49fq5" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.477798 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsgk4" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.477783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsgk4" event={"ID":"2851f3fb-2572-4f19-be86-4771b3b33b06","Type":"ContainerDied","Data":"3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f"} Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.477985 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f891ffde30a0207af6181ffc797564648a61dde115584b28f54d57cc7944a7f" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.479420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bba4-account-create-update-d28hs" event={"ID":"b1c38ae4-a741-4aaf-9f63-9f2384ad30a3","Type":"ContainerDied","Data":"cafc4801c238c21f3b7368808f3f24696412c7b74c20854d206066dffeee28c5"} Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.479450 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cafc4801c238c21f3b7368808f3f24696412c7b74c20854d206066dffeee28c5" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.479457 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bba4-account-create-update-d28hs" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.482952 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dp4b6" event={"ID":"dd694032-4f44-44b3-b920-83c2cae0bcb5","Type":"ContainerDied","Data":"21e4abe4432a6e8826dfb1d664f98503ac69872c821ea3482c0f76dd3198dfba"} Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.482979 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dp4b6" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.482983 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e4abe4432a6e8826dfb1d664f98503ac69872c821ea3482c0f76dd3198dfba" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.484119 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522323 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5feba328-ee96-4d3d-8654-c70374332b17-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522374 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2851f3fb-2572-4f19-be86-4771b3b33b06-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522392 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522409 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r246r\" (UniqueName: \"kubernetes.io/projected/dd694032-4f44-44b3-b920-83c2cae0bcb5-kube-api-access-r246r\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522429 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz9fh\" (UniqueName: \"kubernetes.io/projected/2851f3fb-2572-4f19-be86-4771b3b33b06-kube-api-access-rz9fh\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522444 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjbw\" 
(UniqueName: \"kubernetes.io/projected/5feba328-ee96-4d3d-8654-c70374332b17-kube-api-access-pdjbw\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522460 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gtrs\" (UniqueName: \"kubernetes.io/projected/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe-kube-api-access-4gtrs\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:28 crc kubenswrapper[4773]: I0121 15:45:28.522477 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd694032-4f44-44b3-b920-83c2cae0bcb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.538140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.544116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/111512a9-4e17-4433-a7e9-e8666099d12f-etc-swift\") pod \"swift-storage-0\" (UID: \"111512a9-4e17-4433-a7e9-e8666099d12f\") " pod="openstack/swift-storage-0" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.641741 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.696545 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7vcf9"] Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697077 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f06063-64e0-4270-9b6e-258104f23d0a" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697103 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f06063-64e0-4270-9b6e-258104f23d0a" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697114 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697123 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697135 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2851f3fb-2572-4f19-be86-4771b3b33b06" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697145 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2851f3fb-2572-4f19-be86-4771b3b33b06" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697157 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd694032-4f44-44b3-b920-83c2cae0bcb5" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697166 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd694032-4f44-44b3-b920-83c2cae0bcb5" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697181 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5feba328-ee96-4d3d-8654-c70374332b17" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697189 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feba328-ee96-4d3d-8654-c70374332b17" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: E0121 15:45:29.697215 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697222 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697446 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd694032-4f44-44b3-b920-83c2cae0bcb5" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697465 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5feba328-ee96-4d3d-8654-c70374332b17" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697482 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697493 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f06063-64e0-4270-9b6e-258104f23d0a" containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697505 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2851f3fb-2572-4f19-be86-4771b3b33b06" containerName="mariadb-database-create" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.697514 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" 
containerName="mariadb-account-create-update" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.698441 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.701069 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-stvz5" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.701282 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.721073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7vcf9"] Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.845955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.846003 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.846032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.846131 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2ql8\" (UniqueName: \"kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.947822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2ql8\" (UniqueName: \"kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.947916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.947950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.947983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.955187 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.955247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.955267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:29 crc kubenswrapper[4773]: I0121 15:45:29.980633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2ql8\" (UniqueName: \"kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8\") pod \"glance-db-sync-7vcf9\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") " pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.076941 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=< Jan 21 15:45:30 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 15:45:30 crc kubenswrapper[4773]: > Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.086658 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7vcf9" Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.276985 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.848291 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7vcf9"] Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.856663 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:45:30 crc kubenswrapper[4773]: I0121 15:45:30.960876 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.253981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6tg4p"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.255861 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.262254 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6tg4p"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.313319 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6z796"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.314829 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.320160 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.325338 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6z796"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.371770 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6x9bw"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.373423 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.387802 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.387894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zzv\" (UniqueName: \"kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.401963 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6x9bw"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.489483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.489587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zzv\" (UniqueName: \"kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.489667 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csg6q\" (UniqueName: \"kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.490020 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.490794 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.491154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7hmw7\" (UniqueName: \"kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.491477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.509928 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vcf9" event={"ID":"3ee2313d-678e-487c-a4af-ae303d40bedd","Type":"ContainerStarted","Data":"806097a4d438ac642a784a61d396a7b7af89579fbbbc6fac52df15ebfce3ae2b"} Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.514474 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zzv\" (UniqueName: \"kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv\") pod \"barbican-db-create-6tg4p\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") " pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.552351 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-fkmzk"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.553806 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.578147 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.579279 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.596263 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-fkmzk"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.597379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csg6q\" (UniqueName: \"kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.597451 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.597539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmw7\" (UniqueName: \"kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.597565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.598777 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.599049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.601453 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.611636 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a51f-account-create-update-5nnlx"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.613262 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.620354 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.638275 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a51f-account-create-update-5nnlx"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.671642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmw7\" (UniqueName: \"kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7\") pod \"cinder-db-create-6x9bw\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") " pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.672276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csg6q\" (UniqueName: \"kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q\") pod \"root-account-create-update-6z796\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.674226 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6z796" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.691744 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6x9bw" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.700575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2gp9\" (UniqueName: \"kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9\") pod \"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.700856 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts\") pod \"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.703293 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-141e-account-create-update-z9rr9"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.704760 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.713433 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.781011 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-141e-account-create-update-z9rr9"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.804439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts\") pod \"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.804611 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2gp9\" (UniqueName: \"kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9\") pod \"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.805144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48l5l\" (UniqueName: \"kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.805315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: 
\"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.806126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts\") pod \"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.847891 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.848241 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="prometheus" containerID="cri-o://bd218997989eb32d58d9b1df12ca198f1dceb6fe8b614b283eaafe8ec0424eb7" gracePeriod=600 Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.849019 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="config-reloader" containerID="cri-o://33d58eb8fe3bcc9637436b89f13bb4d5cd715e800a5536cbd0ad491cb57a007a" gracePeriod=600 Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.849120 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="thanos-sidecar" containerID="cri-o://357714fca181450f1d0365922b37c9854a4d7b09ac685db4ff8ad351120ec2db" gracePeriod=600 Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.850471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2gp9\" (UniqueName: \"kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9\") pod 
\"cloudkitty-db-create-fkmzk\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.909603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.910140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48l5l\" (UniqueName: \"kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.910461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.911802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.913996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrwp\" 
(UniqueName: \"kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.947200 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3d4e-account-create-update-zgrzh"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.954355 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.964038 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.968640 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48l5l\" (UniqueName: \"kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l\") pod \"cinder-a51f-account-create-update-5nnlx\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.971128 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3d4e-account-create-update-zgrzh"] Jan 21 15:45:31 crc kubenswrapper[4773]: I0121 15:45:31.999750 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hvg5f"] Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.001804 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.014911 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hvg5f"] Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.016273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.016416 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrwp\" (UniqueName: \"kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.017492 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.051235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrwp\" (UniqueName: \"kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp\") pod \"barbican-141e-account-create-update-z9rr9\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.075941 4773 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.118281 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.118407 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fcf\" (UniqueName: \"kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.118430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bcms\" (UniqueName: \"kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.118496 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.205166 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-8e6c-account-create-update-8642z"] Jan 21 15:45:32 crc kubenswrapper[4773]: 
I0121 15:45:32.207384 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.209572 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.210610 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.221303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.221428 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.221624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fcf\" (UniqueName: \"kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.221652 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bcms\" (UniqueName: 
\"kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.222763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.223635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.248114 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.267897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fcf\" (UniqueName: \"kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf\") pod \"neutron-3d4e-account-create-update-zgrzh\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.270926 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bcms\" (UniqueName: \"kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms\") pod \"neutron-db-create-hvg5f\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.280928 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-8e6c-account-create-update-8642z"] Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.331808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvrb\" (UniqueName: \"kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.331875 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc 
kubenswrapper[4773]: I0121 15:45:32.341633 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.372732 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.433386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvrb\" (UniqueName: \"kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.433459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.434213 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.466707 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvrb\" (UniqueName: \"kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb\") pod \"cloudkitty-8e6c-account-create-update-8642z\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " 
pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:32 crc kubenswrapper[4773]: I0121 15:45:32.491623 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6tg4p"] Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.559193 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664293 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerID="357714fca181450f1d0365922b37c9854a4d7b09ac685db4ff8ad351120ec2db" exitCode=0 Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664323 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerID="33d58eb8fe3bcc9637436b89f13bb4d5cd715e800a5536cbd0ad491cb57a007a" exitCode=0 Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664330 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerID="bd218997989eb32d58d9b1df12ca198f1dceb6fe8b614b283eaafe8ec0424eb7" exitCode=0 Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerDied","Data":"357714fca181450f1d0365922b37c9854a4d7b09ac685db4ff8ad351120ec2db"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerDied","Data":"33d58eb8fe3bcc9637436b89f13bb4d5cd715e800a5536cbd0ad491cb57a007a"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.664439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerDied","Data":"bd218997989eb32d58d9b1df12ca198f1dceb6fe8b614b283eaafe8ec0424eb7"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.667959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"768fa06d2e1fe5aabcdc1d0f18a5b0bc78c64bbbc8cda0ecbe59718cfaf99ca8"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.739687 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6x9bw"] Jan 21 15:45:34 crc kubenswrapper[4773]: W0121 15:45:32.814295 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71edab3a_2ae1_4703_a506_e2a278eb5542.slice/crio-4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78 WatchSource:0}: Error finding container 4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78: Status 404 returned error can't find the container with id 4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78 Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:32.929933 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.063796 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.063992 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064149 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064243 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064279 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwjm\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064861 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.064873 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.065203 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.065256 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2\") pod \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\" (UID: \"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23\") " Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.065934 4773 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.065955 4773 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.066416 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.071717 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out" (OuterVolumeSpecName: "config-out") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.072334 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.073345 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm" (OuterVolumeSpecName: "kube-api-access-6hwjm") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "kube-api-access-6hwjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.074921 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config" (OuterVolumeSpecName: "config") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.079926 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.116240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "pvc-548c9d94-c024-405f-acd8-2291e57bfa65". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.118285 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config" (OuterVolumeSpecName: "web-config") pod "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" (UID: "4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.168400 4773 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config-out\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169020 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwjm\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-kube-api-access-6hwjm\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169038 4773 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-web-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169051 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169064 4773 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169080 4773 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169136 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") on node \"crc\" " Jan 
21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.169193 4773 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.235725 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.236007 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-548c9d94-c024-405f-acd8-2291e57bfa65" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65") on node "crc" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.271091 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.685670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23","Type":"ContainerDied","Data":"c3783e3f175efc423655266770ef42770d5e9dc7ced78825682381a82ff01b51"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.685754 4773 scope.go:117] "RemoveContainer" containerID="357714fca181450f1d0365922b37c9854a4d7b09ac685db4ff8ad351120ec2db" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.685977 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.698264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9bw" event={"ID":"71edab3a-2ae1-4703-a506-e2a278eb5542","Type":"ContainerStarted","Data":"b08ddcaa57f63148854f8a8f67ba469dd07c4ae0cfa306ebe0ef01dbf0eee70e"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.698301 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9bw" event={"ID":"71edab3a-2ae1-4703-a506-e2a278eb5542","Type":"ContainerStarted","Data":"4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.700978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6tg4p" event={"ID":"f238aaa6-9768-4a13-b711-158160bfe40f","Type":"ContainerStarted","Data":"2c426a0de209f703f1ba7f615e457ed31887ba833d4198825084bde376ad6015"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.700997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6tg4p" event={"ID":"f238aaa6-9768-4a13-b711-158160bfe40f","Type":"ContainerStarted","Data":"db57d3f30fc9393a1341a6b68de62eb83cf69e3b71f588a0513953f0408dc7b4"} Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.725337 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6x9bw" podStartSLOduration=2.725318221 podStartE2EDuration="2.725318221s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:33.722757251 +0000 UTC m=+1298.647246873" watchObservedRunningTime="2026-01-21 15:45:33.725318221 +0000 UTC m=+1298.649807843" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.782098 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-db-create-6tg4p" podStartSLOduration=2.782067885 podStartE2EDuration="2.782067885s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:33.748168427 +0000 UTC m=+1298.672658049" watchObservedRunningTime="2026-01-21 15:45:33.782067885 +0000 UTC m=+1298.706557507" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.821606 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.832279 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.844887 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:45:34 crc kubenswrapper[4773]: E0121 15:45:33.845439 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="init-config-reloader" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845458 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="init-config-reloader" Jan 21 15:45:34 crc kubenswrapper[4773]: E0121 15:45:33.845484 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="prometheus" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845492 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="prometheus" Jan 21 15:45:34 crc kubenswrapper[4773]: E0121 15:45:33.845506 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="thanos-sidecar" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845514 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="thanos-sidecar" Jan 21 15:45:34 crc kubenswrapper[4773]: E0121 15:45:33.845530 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="config-reloader" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845538 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="config-reloader" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845809 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="config-reloader" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845822 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="thanos-sidecar" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.845849 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="prometheus" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.848469 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.854399 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.854653 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tgvvj" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.854788 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.854935 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.855015 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.855111 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.855470 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.855644 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.862307 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.862559 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989408 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxqp\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-kube-api-access-mnxqp\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989774 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989875 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989923 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:33.989984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " 
pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092606 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxqp\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-kube-api-access-mnxqp\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092740 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092776 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092799 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092818 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092849 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092866 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 
15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.092979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.093728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.094486 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " 
pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.095027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.102919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.105655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.111751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.111833 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.113428 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.114782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0" Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.114929 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.114960 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e54ebd5078dca103e79dcb1052666590d0c6b422a63be683d9d63cec387cb90/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.115706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.115793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-config\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.126995 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxqp\" (UniqueName: \"kubernetes.io/projected/16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8-kube-api-access-mnxqp\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.171479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-548c9d94-c024-405f-acd8-2291e57bfa65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-548c9d94-c024-405f-acd8-2291e57bfa65\") pod \"prometheus-metric-storage-0\" (UID: \"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8\") " pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.180236 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.306510 4773 scope.go:117] "RemoveContainer" containerID="33d58eb8fe3bcc9637436b89f13bb4d5cd715e800a5536cbd0ad491cb57a007a"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.395564 4773 scope.go:117] "RemoveContainer" containerID="bd218997989eb32d58d9b1df12ca198f1dceb6fe8b614b283eaafe8ec0424eb7"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.447255 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jdjh8"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.449686 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.458753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sf6hv"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.458778 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.458963 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.462946 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.468597 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jdjh8"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.491199 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-fkmzk"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.513235 4773 scope.go:117] "RemoveContainer" containerID="45cfcc11228d723dedf07ab470919a57d8b9a7369e6e6b83008e0cd54ea3512d"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.533601 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6z796"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.620734 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.620873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crbvd\" (UniqueName: \"kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.620903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.626810 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a51f-account-create-update-5nnlx"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.644433 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-141e-account-create-update-z9rr9"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.723043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crbvd\" (UniqueName: \"kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.724633 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.725039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.731579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.732686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.747146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crbvd\" (UniqueName: \"kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd\") pod \"keystone-db-sync-jdjh8\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") " pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.828395 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.874296 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-8e6c-account-create-update-8642z"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.910109 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3d4e-account-create-update-zgrzh"]
Jan 21 15:45:34 crc kubenswrapper[4773]: I0121 15:45:34.923652 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hvg5f"]
Jan 21 15:45:35 crc kubenswrapper[4773]: W0121 15:45:35.008514 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e701e1_3d35_4188_9fe4_8e25b7e0e99e.slice/crio-382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5 WatchSource:0}: Error finding container 382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5: Status 404 returned error can't find the container with id 382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5
Jan 21 15:45:35 crc kubenswrapper[4773]: W0121 15:45:35.016824 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30fc1d3b_2e18_449e_87be_3cab6a8668a1.slice/crio-146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322 WatchSource:0}: Error finding container 146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322: Status 404 returned error can't find the container with id 146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.102616 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zwrfs" podUID="1f582857-cae4-4fa2-896d-b763b224ad8e" containerName="ovn-controller" probeResult="failure" output=<
Jan 21 15:45:35 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 21 15:45:35 crc kubenswrapper[4773]: >
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.114965 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mvwkv"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.176201 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 21 15:45:35 crc kubenswrapper[4773]: W0121 15:45:35.179845 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ae82b0_cc5c_4e5d_9f49_55a813ffbfe8.slice/crio-e35e211d3eee7c18ba9584506946dd9dacf390cea9cc4fecbdddafc221dd2fea WatchSource:0}: Error finding container e35e211d3eee7c18ba9584506946dd9dacf390cea9cc4fecbdddafc221dd2fea: Status 404 returned error can't find the container with id e35e211d3eee7c18ba9584506946dd9dacf390cea9cc4fecbdddafc221dd2fea
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.357377 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zwrfs-config-nvbr6"]
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.359672 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.364665 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.376251 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zwrfs-config-nvbr6"]
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.432474 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" path="/var/lib/kubelet/pods/4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23/volumes"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.545599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.546018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.549303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xm89\" (UniqueName: \"kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.549414 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.549471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.549525 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.599253 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="4d5f13ef-7b6b-41eb-b9ea-be9cbcd21a23" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.650940 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.650990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652489 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xm89\" (UniqueName: \"kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.652716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.653426 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.653478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.653501 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.687213 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xm89\" (UniqueName: \"kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89\") pod \"ovn-controller-zwrfs-config-nvbr6\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.712979 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zwrfs-config-nvbr6"
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.772550 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hvg5f" event={"ID":"46e304a8-01b0-46e8-85b0-d06af7a285c6","Type":"ContainerStarted","Data":"e6ff0ba9c4621c53243d857cd3a74921ffa12eaa59cf905287cc928ddffa71e0"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.789188 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerStarted","Data":"e35e211d3eee7c18ba9584506946dd9dacf390cea9cc4fecbdddafc221dd2fea"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.791130 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" event={"ID":"e14d0772-1452-4862-ad42-7c992e1bc03a","Type":"ContainerStarted","Data":"93cfbd32a7c9f9eecae72d8f640a789593bfd7e11c292594d5c782a0b75349ae"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.793072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-141e-account-create-update-z9rr9" event={"ID":"30fc1d3b-2e18-449e-87be-3cab6a8668a1","Type":"ContainerStarted","Data":"146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.801123 4773 generic.go:334] "Generic (PLEG): container finished" podID="f238aaa6-9768-4a13-b711-158160bfe40f" containerID="2c426a0de209f703f1ba7f615e457ed31887ba833d4198825084bde376ad6015" exitCode=0
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.801216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6tg4p" event={"ID":"f238aaa6-9768-4a13-b711-158160bfe40f","Type":"ContainerDied","Data":"2c426a0de209f703f1ba7f615e457ed31887ba833d4198825084bde376ad6015"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.807821 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6z796" event={"ID":"035696a6-e39b-48f7-acc6-3dc896cfbec2","Type":"ContainerStarted","Data":"75518c83389c24ee19ec9e424374f1c6642efedb15a5c8889dd68b0b5e1b7906"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.816737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3d4e-account-create-update-zgrzh" event={"ID":"ce0bf1bf-486e-40e7-80cb-4eff17210708","Type":"ContainerStarted","Data":"f21bfbca86763c1a7e063002064ab46c706e14d4a8f60af97e328e92cd6577a4"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.837305 4773 generic.go:334] "Generic (PLEG): container finished" podID="71edab3a-2ae1-4703-a506-e2a278eb5542" containerID="b08ddcaa57f63148854f8a8f67ba469dd07c4ae0cfa306ebe0ef01dbf0eee70e" exitCode=0
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.837496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9bw" event={"ID":"71edab3a-2ae1-4703-a506-e2a278eb5542","Type":"ContainerDied","Data":"b08ddcaa57f63148854f8a8f67ba469dd07c4ae0cfa306ebe0ef01dbf0eee70e"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.850819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-fkmzk" event={"ID":"1ab9197c-22f5-484b-b154-df64f7433d7d","Type":"ContainerStarted","Data":"4c1452100bcb6bdb4f430b9186cf6bfb765c540c6e7814100e6002f8eef63fa8"}
Jan 21 15:45:35 crc kubenswrapper[4773]: I0121 15:45:35.852827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a51f-account-create-update-5nnlx" event={"ID":"46e701e1-3d35-4188-9fe4-8e25b7e0e99e","Type":"ContainerStarted","Data":"382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.000611 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jdjh8"]
Jan 21 15:45:36 crc kubenswrapper[4773]: W0121 15:45:36.008821 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda667cd6a_52e3_4221_914f_c662638460d4.slice/crio-0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08 WatchSource:0}: Error finding container 0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08: Status 404 returned error can't find the container with id 0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.369593 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zwrfs-config-nvbr6"]
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.887012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" event={"ID":"e14d0772-1452-4862-ad42-7c992e1bc03a","Type":"ContainerStarted","Data":"8adec2fc1a6166c1ee1684c9fdb6cd207fdbc00362a29c3fe4922b7abd97de24"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.905415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"b20ecdc37a00643e8e2420fead719f2c0878b2c52a05052e3fd2e5b529dcbf6b"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.915307 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" podStartSLOduration=4.915287809 podStartE2EDuration="4.915287809s" podCreationTimestamp="2026-01-21 15:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:36.912161733 +0000 UTC m=+1301.836651355" watchObservedRunningTime="2026-01-21 15:45:36.915287809 +0000 UTC m=+1301.839777431"
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.922400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-fkmzk" event={"ID":"1ab9197c-22f5-484b-b154-df64f7433d7d","Type":"ContainerStarted","Data":"a295ec39e5596e593ac63cfb20df027f6dca2405c2a6623e54ad5399a7982e12"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.934910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6z796" event={"ID":"035696a6-e39b-48f7-acc6-3dc896cfbec2","Type":"ContainerStarted","Data":"a5b365f3f16fc1d1d7ebc437e17f6ecd97b9068825e8729cf40a028822d55544"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.938338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zwrfs-config-nvbr6" event={"ID":"5dd750c5-d335-4d55-8967-f73c46478364","Type":"ContainerStarted","Data":"9b2cc654246f0cfd822be786654960dbe7f874a654a5d7e67bdc4096f102f446"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.940600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3d4e-account-create-update-zgrzh" event={"ID":"ce0bf1bf-486e-40e7-80cb-4eff17210708","Type":"ContainerStarted","Data":"8d02e47ccecc0ffee1f8917b3077744d2c6dff83cf1b69b4bf6fb3c17250ac31"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.951966 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hvg5f" event={"ID":"46e304a8-01b0-46e8-85b0-d06af7a285c6","Type":"ContainerStarted","Data":"aa7619e4501048b3f9fc1c5340210df2fd925689278c2a084decd5c5c22e83df"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.954376 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a51f-account-create-update-5nnlx" event={"ID":"46e701e1-3d35-4188-9fe4-8e25b7e0e99e","Type":"ContainerStarted","Data":"dbc0f2ed3adc05f3f27e1629547c8e8bb72187bf44a744660463774089332619"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.956472 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-fkmzk" podStartSLOduration=5.956449446 podStartE2EDuration="5.956449446s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:36.943163422 +0000 UTC m=+1301.867653044" watchObservedRunningTime="2026-01-21 15:45:36.956449446 +0000 UTC m=+1301.880939078"
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.956627 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jdjh8" event={"ID":"a667cd6a-52e3-4221-914f-c662638460d4","Type":"ContainerStarted","Data":"0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.966957 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-6z796" podStartSLOduration=5.966936724 podStartE2EDuration="5.966936724s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:36.963740426 +0000 UTC m=+1301.888230068" watchObservedRunningTime="2026-01-21 15:45:36.966936724 +0000 UTC m=+1301.891426346"
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.968819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-141e-account-create-update-z9rr9" event={"ID":"30fc1d3b-2e18-449e-87be-3cab6a8668a1","Type":"ContainerStarted","Data":"e07c31df794d0612bc360ba53bdab145e593b90d70d987c9422cc3936bf7e013"}
Jan 21 15:45:36 crc kubenswrapper[4773]: I0121 15:45:36.983426 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-3d4e-account-create-update-zgrzh" podStartSLOduration=5.983385484 podStartE2EDuration="5.983385484s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:36.979017205 +0000 UTC m=+1301.903506837" watchObservedRunningTime="2026-01-21 15:45:36.983385484 +0000 UTC m=+1301.907875106"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.015284 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-141e-account-create-update-z9rr9" podStartSLOduration=6.015266917 podStartE2EDuration="6.015266917s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:37.010773354 +0000 UTC m=+1301.935262976" watchObservedRunningTime="2026-01-21 15:45:37.015266917 +0000 UTC m=+1301.939756539"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.064234 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-a51f-account-create-update-5nnlx" podStartSLOduration=6.064208028 podStartE2EDuration="6.064208028s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:37.050123752 +0000 UTC m=+1301.974613374" watchObservedRunningTime="2026-01-21 15:45:37.064208028 +0000 UTC m=+1301.988697650"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.084737 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-hvg5f" podStartSLOduration=6.084686609 podStartE2EDuration="6.084686609s" podCreationTimestamp="2026-01-21 15:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:37.073638286 +0000 UTC m=+1301.998127938" watchObservedRunningTime="2026-01-21 15:45:37.084686609 +0000 UTC m=+1302.009176251"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.411815 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6x9bw"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.437506 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6tg4p"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.526031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hmw7\" (UniqueName: \"kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7\") pod \"71edab3a-2ae1-4703-a506-e2a278eb5542\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") "
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.526173 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts\") pod \"71edab3a-2ae1-4703-a506-e2a278eb5542\" (UID: \"71edab3a-2ae1-4703-a506-e2a278eb5542\") "
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.527336 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71edab3a-2ae1-4703-a506-e2a278eb5542" (UID: "71edab3a-2ae1-4703-a506-e2a278eb5542"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.532863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7" (OuterVolumeSpecName: "kube-api-access-7hmw7") pod "71edab3a-2ae1-4703-a506-e2a278eb5542" (UID: "71edab3a-2ae1-4703-a506-e2a278eb5542"). InnerVolumeSpecName "kube-api-access-7hmw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.628496 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zzv\" (UniqueName: \"kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv\") pod \"f238aaa6-9768-4a13-b711-158160bfe40f\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") "
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.629307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts\") pod \"f238aaa6-9768-4a13-b711-158160bfe40f\" (UID: \"f238aaa6-9768-4a13-b711-158160bfe40f\") "
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.630718 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hmw7\" (UniqueName: \"kubernetes.io/projected/71edab3a-2ae1-4703-a506-e2a278eb5542-kube-api-access-7hmw7\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.630795 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f238aaa6-9768-4a13-b711-158160bfe40f" (UID: "f238aaa6-9768-4a13-b711-158160bfe40f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.630849 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71edab3a-2ae1-4703-a506-e2a278eb5542-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.731921 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f238aaa6-9768-4a13-b711-158160bfe40f-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.825532 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv" (OuterVolumeSpecName: "kube-api-access-84zzv") pod "f238aaa6-9768-4a13-b711-158160bfe40f" (UID: "f238aaa6-9768-4a13-b711-158160bfe40f"). InnerVolumeSpecName "kube-api-access-84zzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.833145 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zzv\" (UniqueName: \"kubernetes.io/projected/f238aaa6-9768-4a13-b711-158160bfe40f-kube-api-access-84zzv\") on node \"crc\" DevicePath \"\""
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.996019 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6x9bw"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.997277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9bw" event={"ID":"71edab3a-2ae1-4703-a506-e2a278eb5542","Type":"ContainerDied","Data":"4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78"}
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.997334 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eab1ae0c60947065b19918e31b89c881014570f29df2f718a0e2598e7c2ca78"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.999516 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6tg4p" event={"ID":"f238aaa6-9768-4a13-b711-158160bfe40f","Type":"ContainerDied","Data":"db57d3f30fc9393a1341a6b68de62eb83cf69e3b71f588a0513953f0408dc7b4"}
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.999564 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db57d3f30fc9393a1341a6b68de62eb83cf69e3b71f588a0513953f0408dc7b4"
Jan 21 15:45:37 crc kubenswrapper[4773]: I0121 15:45:37.999631 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6tg4p" Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.003986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"a69994a9d71727e911acad41e548aaccf3946724f0d83f91e7b79c5a01036267"} Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.004039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"6ab5a79fafa1e4c5b6276f9bb69de9355b3b1f5a131736fa3822fed2891f426f"} Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.005804 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ab9197c-22f5-484b-b154-df64f7433d7d" containerID="a295ec39e5596e593ac63cfb20df027f6dca2405c2a6623e54ad5399a7982e12" exitCode=0 Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.005841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-fkmzk" event={"ID":"1ab9197c-22f5-484b-b154-df64f7433d7d","Type":"ContainerDied","Data":"a295ec39e5596e593ac63cfb20df027f6dca2405c2a6623e54ad5399a7982e12"} Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.008972 4773 generic.go:334] "Generic (PLEG): container finished" podID="5dd750c5-d335-4d55-8967-f73c46478364" containerID="47688de5c00552e762952c1ff7a68d39a9f5a9d91ab0442f7c478ce771a4a42e" exitCode=0 Jan 21 15:45:38 crc kubenswrapper[4773]: I0121 15:45:38.009123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zwrfs-config-nvbr6" event={"ID":"5dd750c5-d335-4d55-8967-f73c46478364","Type":"ContainerDied","Data":"47688de5c00552e762952c1ff7a68d39a9f5a9d91ab0442f7c478ce771a4a42e"} Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.033101 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"a13d770edd671aec9b2512a3499a06b9dc3d003c69a533656c133b33be2f31c1"} Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.034972 4773 generic.go:334] "Generic (PLEG): container finished" podID="46e304a8-01b0-46e8-85b0-d06af7a285c6" containerID="aa7619e4501048b3f9fc1c5340210df2fd925689278c2a084decd5c5c22e83df" exitCode=0 Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.035082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hvg5f" event={"ID":"46e304a8-01b0-46e8-85b0-d06af7a285c6","Type":"ContainerDied","Data":"aa7619e4501048b3f9fc1c5340210df2fd925689278c2a084decd5c5c22e83df"} Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.454915 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zwrfs-config-nvbr6" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.463588 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.516534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts\") pod \"1ab9197c-22f5-484b-b154-df64f7433d7d\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.541289 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ab9197c-22f5-484b-b154-df64f7433d7d" (UID: "1ab9197c-22f5-484b-b154-df64f7433d7d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619450 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619569 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619664 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2gp9\" (UniqueName: \"kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9\") pod \"1ab9197c-22f5-484b-b154-df64f7433d7d\" (UID: \"1ab9197c-22f5-484b-b154-df64f7433d7d\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619707 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619729 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.619803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xm89\" (UniqueName: \"kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89\") pod \"5dd750c5-d335-4d55-8967-f73c46478364\" (UID: \"5dd750c5-d335-4d55-8967-f73c46478364\") " Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620057 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620036 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run" (OuterVolumeSpecName: "var-run") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620715 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620734 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620743 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd750c5-d335-4d55-8967-f73c46478364-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.620751 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab9197c-22f5-484b-b154-df64f7433d7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.621092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.621931 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts" (OuterVolumeSpecName: "scripts") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.626448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89" (OuterVolumeSpecName: "kube-api-access-7xm89") pod "5dd750c5-d335-4d55-8967-f73c46478364" (UID: "5dd750c5-d335-4d55-8967-f73c46478364"). InnerVolumeSpecName "kube-api-access-7xm89". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.626737 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9" (OuterVolumeSpecName: "kube-api-access-p2gp9") pod "1ab9197c-22f5-484b-b154-df64f7433d7d" (UID: "1ab9197c-22f5-484b-b154-df64f7433d7d"). InnerVolumeSpecName "kube-api-access-p2gp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.723111 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2gp9\" (UniqueName: \"kubernetes.io/projected/1ab9197c-22f5-484b-b154-df64f7433d7d-kube-api-access-p2gp9\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.723153 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.723167 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xm89\" (UniqueName: \"kubernetes.io/projected/5dd750c5-d335-4d55-8967-f73c46478364-kube-api-access-7xm89\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:39 crc kubenswrapper[4773]: I0121 15:45:39.723179 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5dd750c5-d335-4d55-8967-f73c46478364-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.046369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zwrfs-config-nvbr6" event={"ID":"5dd750c5-d335-4d55-8967-f73c46478364","Type":"ContainerDied","Data":"9b2cc654246f0cfd822be786654960dbe7f874a654a5d7e67bdc4096f102f446"} Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.046406 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2cc654246f0cfd822be786654960dbe7f874a654a5d7e67bdc4096f102f446" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.046466 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zwrfs-config-nvbr6" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.049869 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-fkmzk" event={"ID":"1ab9197c-22f5-484b-b154-df64f7433d7d","Type":"ContainerDied","Data":"4c1452100bcb6bdb4f430b9186cf6bfb765c540c6e7814100e6002f8eef63fa8"} Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.049901 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c1452100bcb6bdb4f430b9186cf6bfb765c540c6e7814100e6002f8eef63fa8" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.049931 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-fkmzk" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.069992 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zwrfs" Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.544662 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zwrfs-config-nvbr6"] Jan 21 15:45:40 crc kubenswrapper[4773]: I0121 15:45:40.552870 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zwrfs-config-nvbr6"] Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.059197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerStarted","Data":"d90b8e5cef415827288160e0a5879402bc6e7fca82df25581001116c0fdac731"} Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.061332 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hvg5f" event={"ID":"46e304a8-01b0-46e8-85b0-d06af7a285c6","Type":"ContainerDied","Data":"e6ff0ba9c4621c53243d857cd3a74921ffa12eaa59cf905287cc928ddffa71e0"} Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.061369 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ff0ba9c4621c53243d857cd3a74921ffa12eaa59cf905287cc928ddffa71e0" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.098598 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.267027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bcms\" (UniqueName: \"kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms\") pod \"46e304a8-01b0-46e8-85b0-d06af7a285c6\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.267251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts\") pod \"46e304a8-01b0-46e8-85b0-d06af7a285c6\" (UID: \"46e304a8-01b0-46e8-85b0-d06af7a285c6\") " Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.267906 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46e304a8-01b0-46e8-85b0-d06af7a285c6" (UID: "46e304a8-01b0-46e8-85b0-d06af7a285c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.277566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms" (OuterVolumeSpecName: "kube-api-access-7bcms") pod "46e304a8-01b0-46e8-85b0-d06af7a285c6" (UID: "46e304a8-01b0-46e8-85b0-d06af7a285c6"). InnerVolumeSpecName "kube-api-access-7bcms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.369249 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e304a8-01b0-46e8-85b0-d06af7a285c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.369285 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bcms\" (UniqueName: \"kubernetes.io/projected/46e304a8-01b0-46e8-85b0-d06af7a285c6-kube-api-access-7bcms\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:41 crc kubenswrapper[4773]: I0121 15:45:41.397370 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd750c5-d335-4d55-8967-f73c46478364" path="/var/lib/kubelet/pods/5dd750c5-d335-4d55-8967-f73c46478364/volumes" Jan 21 15:45:42 crc kubenswrapper[4773]: I0121 15:45:42.071787 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hvg5f" Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.085223 4773 generic.go:334] "Generic (PLEG): container finished" podID="30fc1d3b-2e18-449e-87be-3cab6a8668a1" containerID="e07c31df794d0612bc360ba53bdab145e593b90d70d987c9422cc3936bf7e013" exitCode=0 Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.085308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-141e-account-create-update-z9rr9" event={"ID":"30fc1d3b-2e18-449e-87be-3cab6a8668a1","Type":"ContainerDied","Data":"e07c31df794d0612bc360ba53bdab145e593b90d70d987c9422cc3936bf7e013"} Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.090327 4773 generic.go:334] "Generic (PLEG): container finished" podID="035696a6-e39b-48f7-acc6-3dc896cfbec2" containerID="a5b365f3f16fc1d1d7ebc437e17f6ecd97b9068825e8729cf40a028822d55544" exitCode=0 Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.090411 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-6z796" event={"ID":"035696a6-e39b-48f7-acc6-3dc896cfbec2","Type":"ContainerDied","Data":"a5b365f3f16fc1d1d7ebc437e17f6ecd97b9068825e8729cf40a028822d55544"} Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.106762 4773 generic.go:334] "Generic (PLEG): container finished" podID="ce0bf1bf-486e-40e7-80cb-4eff17210708" containerID="8d02e47ccecc0ffee1f8917b3077744d2c6dff83cf1b69b4bf6fb3c17250ac31" exitCode=0 Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.106905 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3d4e-account-create-update-zgrzh" event={"ID":"ce0bf1bf-486e-40e7-80cb-4eff17210708","Type":"ContainerDied","Data":"8d02e47ccecc0ffee1f8917b3077744d2c6dff83cf1b69b4bf6fb3c17250ac31"} Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.108796 4773 generic.go:334] "Generic (PLEG): container finished" podID="46e701e1-3d35-4188-9fe4-8e25b7e0e99e" containerID="dbc0f2ed3adc05f3f27e1629547c8e8bb72187bf44a744660463774089332619" exitCode=0 Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.108864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a51f-account-create-update-5nnlx" event={"ID":"46e701e1-3d35-4188-9fe4-8e25b7e0e99e","Type":"ContainerDied","Data":"dbc0f2ed3adc05f3f27e1629547c8e8bb72187bf44a744660463774089332619"} Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.110674 4773 generic.go:334] "Generic (PLEG): container finished" podID="e14d0772-1452-4862-ad42-7c992e1bc03a" containerID="8adec2fc1a6166c1ee1684c9fdb6cd207fdbc00362a29c3fe4922b7abd97de24" exitCode=0 Jan 21 15:45:43 crc kubenswrapper[4773]: I0121 15:45:43.110735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" event={"ID":"e14d0772-1452-4862-ad42-7c992e1bc03a","Type":"ContainerDied","Data":"8adec2fc1a6166c1ee1684c9fdb6cd207fdbc00362a29c3fe4922b7abd97de24"} Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 
15:45:45.488519 4773 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9c654304-9ce6-4243-9273-bfd23bdc0ac8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9c654304-9ce6-4243-9273-bfd23bdc0ac8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9c654304_9ce6_4243_9273_bfd23bdc0ac8.slice" Jan 21 15:45:45 crc kubenswrapper[4773]: E0121 15:45:45.488855 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod9c654304-9ce6-4243-9273-bfd23bdc0ac8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod9c654304-9ce6-4243-9273-bfd23bdc0ac8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9c654304_9ce6_4243_9273_bfd23bdc0ac8.slice" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.683820 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.693929 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.701151 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.707776 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.720643 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6z796" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851345 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvrb\" (UniqueName: \"kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb\") pod \"e14d0772-1452-4862-ad42-7c992e1bc03a\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851413 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48l5l\" (UniqueName: \"kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l\") pod \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2fcf\" (UniqueName: \"kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf\") pod \"ce0bf1bf-486e-40e7-80cb-4eff17210708\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851561 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlrwp\" (UniqueName: \"kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp\") pod \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csg6q\" (UniqueName: \"kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q\") pod \"035696a6-e39b-48f7-acc6-3dc896cfbec2\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851641 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts\") pod \"e14d0772-1452-4862-ad42-7c992e1bc03a\" (UID: \"e14d0772-1452-4862-ad42-7c992e1bc03a\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851748 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts\") pod \"ce0bf1bf-486e-40e7-80cb-4eff17210708\" (UID: \"ce0bf1bf-486e-40e7-80cb-4eff17210708\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851777 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts\") pod \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\" (UID: \"30fc1d3b-2e18-449e-87be-3cab6a8668a1\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts\") pod \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\" (UID: \"46e701e1-3d35-4188-9fe4-8e25b7e0e99e\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.851822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts\") pod \"035696a6-e39b-48f7-acc6-3dc896cfbec2\" (UID: \"035696a6-e39b-48f7-acc6-3dc896cfbec2\") " Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.854008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"035696a6-e39b-48f7-acc6-3dc896cfbec2" (UID: "035696a6-e39b-48f7-acc6-3dc896cfbec2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.854547 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e14d0772-1452-4862-ad42-7c992e1bc03a" (UID: "e14d0772-1452-4862-ad42-7c992e1bc03a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.854956 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce0bf1bf-486e-40e7-80cb-4eff17210708" (UID: "ce0bf1bf-486e-40e7-80cb-4eff17210708"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.855244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30fc1d3b-2e18-449e-87be-3cab6a8668a1" (UID: "30fc1d3b-2e18-449e-87be-3cab6a8668a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.855378 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46e701e1-3d35-4188-9fe4-8e25b7e0e99e" (UID: "46e701e1-3d35-4188-9fe4-8e25b7e0e99e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.858718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf" (OuterVolumeSpecName: "kube-api-access-t2fcf") pod "ce0bf1bf-486e-40e7-80cb-4eff17210708" (UID: "ce0bf1bf-486e-40e7-80cb-4eff17210708"). InnerVolumeSpecName "kube-api-access-t2fcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.858785 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb" (OuterVolumeSpecName: "kube-api-access-fmvrb") pod "e14d0772-1452-4862-ad42-7c992e1bc03a" (UID: "e14d0772-1452-4862-ad42-7c992e1bc03a"). InnerVolumeSpecName "kube-api-access-fmvrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.861259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q" (OuterVolumeSpecName: "kube-api-access-csg6q") pod "035696a6-e39b-48f7-acc6-3dc896cfbec2" (UID: "035696a6-e39b-48f7-acc6-3dc896cfbec2"). InnerVolumeSpecName "kube-api-access-csg6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.874273 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp" (OuterVolumeSpecName: "kube-api-access-qlrwp") pod "30fc1d3b-2e18-449e-87be-3cab6a8668a1" (UID: "30fc1d3b-2e18-449e-87be-3cab6a8668a1"). InnerVolumeSpecName "kube-api-access-qlrwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.876780 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l" (OuterVolumeSpecName: "kube-api-access-48l5l") pod "46e701e1-3d35-4188-9fe4-8e25b7e0e99e" (UID: "46e701e1-3d35-4188-9fe4-8e25b7e0e99e"). InnerVolumeSpecName "kube-api-access-48l5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.953961 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2fcf\" (UniqueName: \"kubernetes.io/projected/ce0bf1bf-486e-40e7-80cb-4eff17210708-kube-api-access-t2fcf\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.953993 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlrwp\" (UniqueName: \"kubernetes.io/projected/30fc1d3b-2e18-449e-87be-3cab6a8668a1-kube-api-access-qlrwp\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954006 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csg6q\" (UniqueName: \"kubernetes.io/projected/035696a6-e39b-48f7-acc6-3dc896cfbec2-kube-api-access-csg6q\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954016 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e14d0772-1452-4862-ad42-7c992e1bc03a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954027 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0bf1bf-486e-40e7-80cb-4eff17210708-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954038 4773 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30fc1d3b-2e18-449e-87be-3cab6a8668a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954046 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954058 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035696a6-e39b-48f7-acc6-3dc896cfbec2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954066 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvrb\" (UniqueName: \"kubernetes.io/projected/e14d0772-1452-4862-ad42-7c992e1bc03a-kube-api-access-fmvrb\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:45 crc kubenswrapper[4773]: I0121 15:45:45.954075 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48l5l\" (UniqueName: \"kubernetes.io/projected/46e701e1-3d35-4188-9fe4-8e25b7e0e99e-kube-api-access-48l5l\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.138564 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.138559 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8e6c-account-create-update-8642z" event={"ID":"e14d0772-1452-4862-ad42-7c992e1bc03a","Type":"ContainerDied","Data":"93cfbd32a7c9f9eecae72d8f640a789593bfd7e11c292594d5c782a0b75349ae"} Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.138709 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93cfbd32a7c9f9eecae72d8f640a789593bfd7e11c292594d5c782a0b75349ae" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.140629 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-141e-account-create-update-z9rr9" event={"ID":"30fc1d3b-2e18-449e-87be-3cab6a8668a1","Type":"ContainerDied","Data":"146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322"} Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.140667 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146f8a2bbb6509b93e6c2ea4f3c8d49f206ac8f627ca835581b696aaf4291322" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.140641 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-141e-account-create-update-z9rr9" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.142128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6z796" event={"ID":"035696a6-e39b-48f7-acc6-3dc896cfbec2","Type":"ContainerDied","Data":"75518c83389c24ee19ec9e424374f1c6642efedb15a5c8889dd68b0b5e1b7906"} Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.142157 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75518c83389c24ee19ec9e424374f1c6642efedb15a5c8889dd68b0b5e1b7906" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.142166 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6z796" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.143557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3d4e-account-create-update-zgrzh" event={"ID":"ce0bf1bf-486e-40e7-80cb-4eff17210708","Type":"ContainerDied","Data":"f21bfbca86763c1a7e063002064ab46c706e14d4a8f60af97e328e92cd6577a4"} Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.143585 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f21bfbca86763c1a7e063002064ab46c706e14d4a8f60af97e328e92cd6577a4" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.143644 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3d4e-account-create-update-zgrzh" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.146650 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-rxztr" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.147226 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a51f-account-create-update-5nnlx" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.148243 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a51f-account-create-update-5nnlx" event={"ID":"46e701e1-3d35-4188-9fe4-8e25b7e0e99e","Type":"ContainerDied","Data":"382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5"} Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.148266 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="382e49b6d26e97961c71855fc55c0ef736c0c70b3630fb2e5834805fda1f29b5" Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.193318 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"] Jan 21 15:45:46 crc kubenswrapper[4773]: I0121 15:45:46.203342 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-rxztr"] Jan 21 15:45:47 crc kubenswrapper[4773]: I0121 15:45:47.158636 4773 generic.go:334] "Generic (PLEG): container finished" podID="16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8" containerID="d90b8e5cef415827288160e0a5879402bc6e7fca82df25581001116c0fdac731" exitCode=0 Jan 21 15:45:47 crc kubenswrapper[4773]: I0121 15:45:47.158742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerDied","Data":"d90b8e5cef415827288160e0a5879402bc6e7fca82df25581001116c0fdac731"} Jan 21 15:45:47 crc kubenswrapper[4773]: I0121 15:45:47.397137 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c654304-9ce6-4243-9273-bfd23bdc0ac8" path="/var/lib/kubelet/pods/9c654304-9ce6-4243-9273-bfd23bdc0ac8/volumes" Jan 21 15:45:47 crc kubenswrapper[4773]: I0121 15:45:47.753201 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6z796"] Jan 21 15:45:47 crc kubenswrapper[4773]: I0121 15:45:47.762704 
4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6z796"] Jan 21 15:45:49 crc kubenswrapper[4773]: I0121 15:45:49.395326 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035696a6-e39b-48f7-acc6-3dc896cfbec2" path="/var/lib/kubelet/pods/035696a6-e39b-48f7-acc6-3dc896cfbec2/volumes" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.403530 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hvk82"] Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404113 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e304a8-01b0-46e8-85b0-d06af7a285c6" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404124 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e304a8-01b0-46e8-85b0-d06af7a285c6" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404136 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd750c5-d335-4d55-8967-f73c46478364" containerName="ovn-config" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404142 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd750c5-d335-4d55-8967-f73c46478364" containerName="ovn-config" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404157 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e701e1-3d35-4188-9fe4-8e25b7e0e99e" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404163 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e701e1-3d35-4188-9fe4-8e25b7e0e99e" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404179 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14d0772-1452-4862-ad42-7c992e1bc03a" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc 
kubenswrapper[4773]: I0121 15:45:51.404185 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14d0772-1452-4862-ad42-7c992e1bc03a" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404192 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f238aaa6-9768-4a13-b711-158160bfe40f" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404198 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f238aaa6-9768-4a13-b711-158160bfe40f" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404208 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fc1d3b-2e18-449e-87be-3cab6a8668a1" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404213 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fc1d3b-2e18-449e-87be-3cab6a8668a1" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404225 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035696a6-e39b-48f7-acc6-3dc896cfbec2" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404231 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="035696a6-e39b-48f7-acc6-3dc896cfbec2" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404242 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0bf1bf-486e-40e7-80cb-4eff17210708" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404247 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0bf1bf-486e-40e7-80cb-4eff17210708" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404256 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1ab9197c-22f5-484b-b154-df64f7433d7d" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404262 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab9197c-22f5-484b-b154-df64f7433d7d" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: E0121 15:45:51.404272 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71edab3a-2ae1-4703-a506-e2a278eb5542" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404278 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="71edab3a-2ae1-4703-a506-e2a278eb5542" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404428 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14d0772-1452-4862-ad42-7c992e1bc03a" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404448 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e304a8-01b0-46e8-85b0-d06af7a285c6" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404467 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab9197c-22f5-484b-b154-df64f7433d7d" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404480 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd750c5-d335-4d55-8967-f73c46478364" containerName="ovn-config" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404493 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f238aaa6-9768-4a13-b711-158160bfe40f" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.404509 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0bf1bf-486e-40e7-80cb-4eff17210708" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc 
kubenswrapper[4773]: I0121 15:45:51.405817 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fc1d3b-2e18-449e-87be-3cab6a8668a1" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.405877 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="71edab3a-2ae1-4703-a506-e2a278eb5542" containerName="mariadb-database-create" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.405901 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e701e1-3d35-4188-9fe4-8e25b7e0e99e" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.405960 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="035696a6-e39b-48f7-acc6-3dc896cfbec2" containerName="mariadb-account-create-update" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.406882 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.409150 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.414327 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hvk82"] Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.570002 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.570059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szwc\" (UniqueName: 
\"kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.676385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.676432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szwc\" (UniqueName: \"kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.677827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.696250 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szwc\" (UniqueName: \"kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc\") pod \"root-account-create-update-hvk82\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:51 crc kubenswrapper[4773]: I0121 15:45:51.733442 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:53 crc kubenswrapper[4773]: E0121 15:45:53.980852 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Jan 21 15:45:53 crc kubenswrapper[4773]: E0121 15:45:53.982502 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2ql8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-7vcf9_openstack(3ee2313d-678e-487c-a4af-ae303d40bedd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:45:53 crc kubenswrapper[4773]: E0121 15:45:53.983899 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-7vcf9" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" Jan 21 15:45:54 crc kubenswrapper[4773]: E0121 15:45:54.271659 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-7vcf9" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" Jan 21 15:45:54 crc kubenswrapper[4773]: I0121 15:45:54.707616 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hvk82"] Jan 21 15:45:54 crc kubenswrapper[4773]: W0121 15:45:54.716897 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeadd8427_70cb_43e7_b550_0cfb6d400987.slice/crio-86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155 WatchSource:0}: Error finding container 86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155: Status 404 
returned error can't find the container with id 86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155 Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.249535 4773 generic.go:334] "Generic (PLEG): container finished" podID="eadd8427-70cb-43e7-b550-0cfb6d400987" containerID="5b8053e13b140d3e520fc9a9279fb3a235af358f3f6b88ff28f0e527dd6af165" exitCode=0 Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.249822 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hvk82" event={"ID":"eadd8427-70cb-43e7-b550-0cfb6d400987","Type":"ContainerDied","Data":"5b8053e13b140d3e520fc9a9279fb3a235af358f3f6b88ff28f0e527dd6af165"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.250009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hvk82" event={"ID":"eadd8427-70cb-43e7-b550-0cfb6d400987","Type":"ContainerStarted","Data":"86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.252535 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jdjh8" event={"ID":"a667cd6a-52e3-4221-914f-c662638460d4","Type":"ContainerStarted","Data":"a1f7869f1fa2684a7a9bc185781c6e35baf7b9a82cf4ec4f6d2b6341afff411f"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.256979 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerStarted","Data":"7a45fcd3f7b6de2628bac2abeea03d2b51bbc40a43edbb2a60ba6b6b88f7f151"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.263947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"cff8ce1eb0ef6de8cf1c6bac1518fd5c9d687bd2b10db4c7894e6c4d72df3e05"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.264000 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"5e4ea176522f175ac2281bb81b7f964cf22fb2efd9c6b3e03958dc564676aba4"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.264224 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"99a9868b9d1ec555b98962666a8f202e3e99c887d357b2f59c739e288efe8b4e"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.264248 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"196d742f48daa774c37032c82810b2b85dcf12f289228830b43027c9248603a2"} Jan 21 15:45:55 crc kubenswrapper[4773]: I0121 15:45:55.293356 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jdjh8" podStartSLOduration=3.084301627 podStartE2EDuration="21.293333031s" podCreationTimestamp="2026-01-21 15:45:34 +0000 UTC" firstStartedPulling="2026-01-21 15:45:36.017016443 +0000 UTC m=+1300.941506065" lastFinishedPulling="2026-01-21 15:45:54.226047847 +0000 UTC m=+1319.150537469" observedRunningTime="2026-01-21 15:45:55.280591582 +0000 UTC m=+1320.205081224" watchObservedRunningTime="2026-01-21 15:45:55.293333031 +0000 UTC m=+1320.217822663" Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.627749 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hvk82" Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.790024 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts\") pod \"eadd8427-70cb-43e7-b550-0cfb6d400987\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.790156 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2szwc\" (UniqueName: \"kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc\") pod \"eadd8427-70cb-43e7-b550-0cfb6d400987\" (UID: \"eadd8427-70cb-43e7-b550-0cfb6d400987\") " Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.790820 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eadd8427-70cb-43e7-b550-0cfb6d400987" (UID: "eadd8427-70cb-43e7-b550-0cfb6d400987"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.796796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc" (OuterVolumeSpecName: "kube-api-access-2szwc") pod "eadd8427-70cb-43e7-b550-0cfb6d400987" (UID: "eadd8427-70cb-43e7-b550-0cfb6d400987"). InnerVolumeSpecName "kube-api-access-2szwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.892555 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2szwc\" (UniqueName: \"kubernetes.io/projected/eadd8427-70cb-43e7-b550-0cfb6d400987-kube-api-access-2szwc\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:56 crc kubenswrapper[4773]: I0121 15:45:56.892807 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadd8427-70cb-43e7-b550-0cfb6d400987-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.283955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerStarted","Data":"ce0aa885b268f9b97810e801555d64e97bc724780971b4fab60b31ba2214a6ee"} Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.291252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"59dd96d947c6bb3568a2f03e7222a28adeffa09c7ca4b066b50220b04f2bfd68"} Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.291305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"faef83c10998516d8cc8f5a709388d1684f00dd60df4447d978831a5967b9279"} Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.291318 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"f8980437310058709a0b1cc5ea2a001491bb8cc06a9c71b10bf5235895da57b1"} Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.293075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hvk82" 
event={"ID":"eadd8427-70cb-43e7-b550-0cfb6d400987","Type":"ContainerDied","Data":"86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155"}
Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.293116 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86f1e6ae20d4667401cc3fb20afeca95621162c0855bedba648fc792ae79d155"
Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.293145 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hvk82"
Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.807593 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hvk82"]
Jan 21 15:45:57 crc kubenswrapper[4773]: I0121 15:45:57.825207 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hvk82"]
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.319942 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8","Type":"ContainerStarted","Data":"dfb11afbd987cf0b8026d1011a3b2b48a210279985a27bde5bc1a898939ccfb0"}
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.326850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"f8bcb840fe5bc80e2b4c00fe5c53b8304518e89b1ca698f38bc2b6e93ff0ce86"}
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.327133 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"57ef49b7751d91edfd9cf3d8cecc6e28883c69e80603b40d8638c52f0808c453"}
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.327216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"fa0164789c08876e55c53e44e3f8c0cfe10cfb352363acd47fbd5883c46e36da"}
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.327286 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"111512a9-4e17-4433-a7e9-e8666099d12f","Type":"ContainerStarted","Data":"d8e6b0da1c434ef6527208a39b94e3178b490312256efacb2d8563689b9d1d09"}
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.359136 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=25.359116527 podStartE2EDuration="25.359116527s" podCreationTimestamp="2026-01-21 15:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:45:58.350978255 +0000 UTC m=+1323.275467877" watchObservedRunningTime="2026-01-21 15:45:58.359116527 +0000 UTC m=+1323.283606149"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.387898 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.514112625 podStartE2EDuration="1m2.387882705s" podCreationTimestamp="2026-01-21 15:44:56 +0000 UTC" firstStartedPulling="2026-01-21 15:45:31.614097541 +0000 UTC m=+1296.538587163" lastFinishedPulling="2026-01-21 15:45:56.487867621 +0000 UTC m=+1321.412357243" observedRunningTime="2026-01-21 15:45:58.384978956 +0000 UTC m=+1323.309468568" watchObservedRunningTime="2026-01-21 15:45:58.387882705 +0000 UTC m=+1323.312372327"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.665586 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:45:58 crc kubenswrapper[4773]: E0121 15:45:58.666351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eadd8427-70cb-43e7-b550-0cfb6d400987" containerName="mariadb-account-create-update"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.666376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="eadd8427-70cb-43e7-b550-0cfb6d400987" containerName="mariadb-account-create-update"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.666618 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="eadd8427-70cb-43e7-b550-0cfb6d400987" containerName="mariadb-account-create-update"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.667856 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.670152 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.682450 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825080 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt964\" (UniqueName: \"kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825153 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825169 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.825798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.927561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt964\" (UniqueName: \"kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.928515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.928649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.928690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.929130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.929407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.955897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt964\" (UniqueName: \"kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964\") pod \"dnsmasq-dns-764c5664d7-ddrwp\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") " pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:58 crc kubenswrapper[4773]: I0121 15:45:58.988249 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:45:59 crc kubenswrapper[4773]: I0121 15:45:59.185837 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Jan 21 15:45:59 crc kubenswrapper[4773]: I0121 15:45:59.339808 4773 generic.go:334] "Generic (PLEG): container finished" podID="a667cd6a-52e3-4221-914f-c662638460d4" containerID="a1f7869f1fa2684a7a9bc185781c6e35baf7b9a82cf4ec4f6d2b6341afff411f" exitCode=0
Jan 21 15:45:59 crc kubenswrapper[4773]: I0121 15:45:59.339898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jdjh8" event={"ID":"a667cd6a-52e3-4221-914f-c662638460d4","Type":"ContainerDied","Data":"a1f7869f1fa2684a7a9bc185781c6e35baf7b9a82cf4ec4f6d2b6341afff411f"}
Jan 21 15:45:59 crc kubenswrapper[4773]: W0121 15:45:59.392328 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10181654_0fe4_4966_92d5_821254a0c1dc.slice/crio-f284c48d5487d8050c692f03d45c19b08ca63e1a0bddc812483a7d011246a3b3 WatchSource:0}: Error finding container f284c48d5487d8050c692f03d45c19b08ca63e1a0bddc812483a7d011246a3b3: Status 404 returned error can't find the container with id f284c48d5487d8050c692f03d45c19b08ca63e1a0bddc812483a7d011246a3b3
Jan 21 15:45:59 crc kubenswrapper[4773]: I0121 15:45:59.399440 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eadd8427-70cb-43e7-b550-0cfb6d400987" path="/var/lib/kubelet/pods/eadd8427-70cb-43e7-b550-0cfb6d400987/volumes"
Jan 21 15:45:59 crc kubenswrapper[4773]: I0121 15:45:59.400134 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.363008 4773 generic.go:334] "Generic (PLEG): container finished" podID="10181654-0fe4-4966-92d5-821254a0c1dc" containerID="a043764c953e9f3659ed8d550a61a6122f4e312733c61090b1c544adb810db22" exitCode=0
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.363072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" event={"ID":"10181654-0fe4-4966-92d5-821254a0c1dc","Type":"ContainerDied","Data":"a043764c953e9f3659ed8d550a61a6122f4e312733c61090b1c544adb810db22"}
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.364112 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" event={"ID":"10181654-0fe4-4966-92d5-821254a0c1dc","Type":"ContainerStarted","Data":"f284c48d5487d8050c692f03d45c19b08ca63e1a0bddc812483a7d011246a3b3"}
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.692299 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.767885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data\") pod \"a667cd6a-52e3-4221-914f-c662638460d4\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") "
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.768007 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle\") pod \"a667cd6a-52e3-4221-914f-c662638460d4\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") "
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.768152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crbvd\" (UniqueName: \"kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd\") pod \"a667cd6a-52e3-4221-914f-c662638460d4\" (UID: \"a667cd6a-52e3-4221-914f-c662638460d4\") "
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.773858 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd" (OuterVolumeSpecName: "kube-api-access-crbvd") pod "a667cd6a-52e3-4221-914f-c662638460d4" (UID: "a667cd6a-52e3-4221-914f-c662638460d4"). InnerVolumeSpecName "kube-api-access-crbvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.801296 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a667cd6a-52e3-4221-914f-c662638460d4" (UID: "a667cd6a-52e3-4221-914f-c662638460d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.814458 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data" (OuterVolumeSpecName: "config-data") pod "a667cd6a-52e3-4221-914f-c662638460d4" (UID: "a667cd6a-52e3-4221-914f-c662638460d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.870358 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.870403 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667cd6a-52e3-4221-914f-c662638460d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:00 crc kubenswrapper[4773]: I0121 15:46:00.870416 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crbvd\" (UniqueName: \"kubernetes.io/projected/a667cd6a-52e3-4221-914f-c662638460d4-kube-api-access-crbvd\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.372482 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" event={"ID":"10181654-0fe4-4966-92d5-821254a0c1dc","Type":"ContainerStarted","Data":"c4cc05f13561187d5c5ed89d58971a1a03cae2f3f9bb826c9d25720de85847aa"}
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.373056 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.374289 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jdjh8" event={"ID":"a667cd6a-52e3-4221-914f-c662638460d4","Type":"ContainerDied","Data":"0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08"}
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.374315 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b58605147550eaa1ec4e4a16e1a39556bd4a46c0a9ca61fdf3dc1fd8bfefc08"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.374364 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jdjh8"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.420663 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" podStartSLOduration=3.420644467 podStartE2EDuration="3.420644467s" podCreationTimestamp="2026-01-21 15:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:01.420415741 +0000 UTC m=+1326.344905393" watchObservedRunningTime="2026-01-21 15:46:01.420644467 +0000 UTC m=+1326.345134089"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.448601 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4qg94"]
Jan 21 15:46:01 crc kubenswrapper[4773]: E0121 15:46:01.449026 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a667cd6a-52e3-4221-914f-c662638460d4" containerName="keystone-db-sync"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.449042 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a667cd6a-52e3-4221-914f-c662638460d4" containerName="keystone-db-sync"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.449227 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a667cd6a-52e3-4221-914f-c662638460d4" containerName="keystone-db-sync"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.450004 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.453318 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.459243 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4qg94"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.609783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.609850 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vprjn\" (UniqueName: \"kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.653938 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.679864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p67qs"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.681420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.686524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.686970 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.687131 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.687429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sf6hv"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.693998 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.697685 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p67qs"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.711497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.711565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vprjn\" (UniqueName: \"kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.712618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.720493 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.723189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.776781 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.809776 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vprjn\" (UniqueName: \"kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn\") pod \"root-account-create-update-4qg94\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") " pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.860574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.860766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.860853 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.860958 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.860986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861070 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861097 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwn9v\" (UniqueName: \"kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861335 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.861589 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwds\" (UniqueName: \"kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.951198 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-l79d2"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.952427 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-l79d2"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.957183 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.957216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tm9bd"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.957271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963843 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963863 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwn9v\" (UniqueName: \"kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.963983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.964012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.964081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwds\" (UniqueName: \"kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.964125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.966857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-l79d2"]
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.969715 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.971257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.971826 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.972012 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.972678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.972869 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.996538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:01 crc kubenswrapper[4773]: I0121 15:46:01.997409 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.005484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.009316 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.009379 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6bhfp"]
Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.010481 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.025630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwds\" (UniqueName: \"kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds\") pod \"dnsmasq-dns-5959f8865f-dq9fp\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") " pod="openstack/dnsmasq-dns-5959f8865f-dq9fp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.026299 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.026534 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x8xpq" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.027496 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.035270 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwn9v\" (UniqueName: \"kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v\") pod \"keystone-bootstrap-p67qs\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") " pod="openstack/keystone-bootstrap-p67qs" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.037353 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6bhfp"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.065369 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.067709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.067761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t484m\" (UniqueName: \"kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.067788 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.071180 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4qg94" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.082293 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.169827 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjsg\" (UniqueName: \"kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.169914 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.169947 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.170027 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.170078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.170106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.170145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t484m\" (UniqueName: \"kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.170179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.180140 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-77kxr"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.181922 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.187889 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.192000 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.208800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.209272 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.216505 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.216772 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-x9snq" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.229441 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-77kxr"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.230155 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t484m\" (UniqueName: \"kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m\") pod \"neutron-db-sync-l79d2\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") " 
pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.240899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-l79d2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.262025 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.263807 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.272357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjsg\" (UniqueName: \"kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.272426 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.272459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.272551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs\") pod \"placement-db-sync-6bhfp\" (UID: 
\"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.272599 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.278768 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.280377 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.284366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.286907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.299819 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-n2bxp"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.301186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.305266 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p67qs" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.305474 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjsg\" (UniqueName: \"kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg\") pod \"placement-db-sync-6bhfp\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") " pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.313293 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rwbpc" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.314149 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.334628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2bxp"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.362210 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-znjg2"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.363571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.367315 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lp86k" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.367928 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.368394 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376424 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrlkd\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376733 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376831 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jxhj2\" (UniqueName: \"kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.376881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.411287 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.440781 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-znjg2"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.455081 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.457332 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.462152 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.462414 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrlkd\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc 
kubenswrapper[4773]: I0121 15:46:02.481353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481396 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhj2\" (UniqueName: \"kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481455 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481473 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481500 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjff\" (UniqueName: \"kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8bhs\" (UniqueName: \"kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481794 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.481850 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.483103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.493976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.494334 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.495207 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.495278 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.495895 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.496411 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.507188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.507617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.519864 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhj2\" (UniqueName: \"kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2\") pod \"dnsmasq-dns-58dd9ff6bc-ssd99\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.519951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data\") pod \"cloudkitty-db-sync-77kxr\" (UID: 
\"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.565149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrlkd\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd\") pod \"cloudkitty-db-sync-77kxr\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.573139 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6bhfp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.584102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bhs\" (UniqueName: \"kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.584189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.584212 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.584251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.587758 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590506 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxsm\" (UniqueName: \"kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590644 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts\") pod \"ceilometer-0\" (UID: 
\"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590865 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjff\" (UniqueName: \"kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 
15:46:02.590884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590916 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.590952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.595498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.597226 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.612925 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.637115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.644249 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.646543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.647564 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.648292 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.648467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bhs\" (UniqueName: \"kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs\") pod \"barbican-db-sync-n2bxp\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") " pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.652289 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjff\" (UniqueName: \"kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff\") pod \"cinder-db-sync-znjg2\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.685316 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.694733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.694961 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxsm\" (UniqueName: \"kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.695037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts\") pod \"ceilometer-0\" (UID: 
\"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.695064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.695126 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.695153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.695186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.700783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.701391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.706596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.707216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.707688 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.713205 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-znjg2" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.715090 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.759835 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxsm\" (UniqueName: \"kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm\") pod \"ceilometer-0\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") " pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.844390 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:46:02 crc kubenswrapper[4773]: I0121 15:46:02.930284 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.263757 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4qg94"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.528212 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4qg94" event={"ID":"9e4ee4bc-4d4e-4227-b817-3de2e3860581","Type":"ContainerStarted","Data":"aa96b7b91b8d89d5d062e9c46ee52c50c9f3b75a2a6767ae85962cd442a993a5"} Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.537745 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="dnsmasq-dns" containerID="cri-o://c4cc05f13561187d5c5ed89d58971a1a03cae2f3f9bb826c9d25720de85847aa" gracePeriod=10 Jan 21 15:46:03 crc kubenswrapper[4773]: 
I0121 15:46:03.537897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp" event={"ID":"217731b1-7352-47f8-a734-d0fa2ac45ab1","Type":"ContainerStarted","Data":"59315133710a3a09d8df9f876225032fb4b3436df7b3b0683ef4260c384f327c"} Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.636500 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-77kxr"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.646475 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.663946 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p67qs"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.705390 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-l79d2"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.861516 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6bhfp"] Jan 21 15:46:03 crc kubenswrapper[4773]: I0121 15:46:03.957832 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2bxp"] Jan 21 15:46:04 crc kubenswrapper[4773]: W0121 15:46:04.003272 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cd3f2b8_c365_4845_8508_2403d3b1f03e.slice/crio-8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471 WatchSource:0}: Error finding container 8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471: Status 404 returned error can't find the container with id 8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471 Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.095598 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.129060 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-znjg2"] Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.180946 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.189995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.395440 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.575116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4qg94" event={"ID":"9e4ee4bc-4d4e-4227-b817-3de2e3860581","Type":"ContainerStarted","Data":"3f99f3742eccde965c85b8c63b109f6acf3d18e10c52777ce3abb069cdac81f8"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.583826 4773 generic.go:334] "Generic (PLEG): container finished" podID="10181654-0fe4-4966-92d5-821254a0c1dc" containerID="c4cc05f13561187d5c5ed89d58971a1a03cae2f3f9bb826c9d25720de85847aa" exitCode=0 Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.583900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" event={"ID":"10181654-0fe4-4966-92d5-821254a0c1dc","Type":"ContainerDied","Data":"c4cc05f13561187d5c5ed89d58971a1a03cae2f3f9bb826c9d25720de85847aa"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.588397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerStarted","Data":"a0d0ee75d0b94890be51f8aab3c9a286fc0c910dfc48815c76051383495fa26d"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.597244 4773 generic.go:334] "Generic (PLEG): container finished" podID="2daf1511-7c23-47cc-900d-07ed88b573c3" 
containerID="599e0869d6316a92a42f1b552d07a7c5804b159674b137b042200e74126d9506" exitCode=0 Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.597345 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" event={"ID":"2daf1511-7c23-47cc-900d-07ed88b573c3","Type":"ContainerDied","Data":"599e0869d6316a92a42f1b552d07a7c5804b159674b137b042200e74126d9506"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.597387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" event={"ID":"2daf1511-7c23-47cc-900d-07ed88b573c3","Type":"ContainerStarted","Data":"ea4d74e71ead718925525104cae50f5fe9b2cea5f266e4bf324c9b4e37c39803"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.608576 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l79d2" event={"ID":"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd","Type":"ContainerStarted","Data":"5d733e8c68f4ba219ff27620276d938a782447f1b4e38ce0f63ddc39f8b58720"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.608622 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l79d2" event={"ID":"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd","Type":"ContainerStarted","Data":"68fe6b76b8445a66ffdce0e01adc48929d8fa5180ccac9fa976e47ffc76b2336"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.611830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znjg2" event={"ID":"5c0beb28-481c-4507-94c2-d644e4faf5ab","Type":"ContainerStarted","Data":"06a9dd01ae16972006c7663ab8cb0e550476406076303d343aba67dcfe4a7537"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.613107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-77kxr" event={"ID":"de6a84b1-1846-4dd0-be7f-47a8872227ff","Type":"ContainerStarted","Data":"b3cfdc61907b073f19e71c6eab39ed7e0fd17bc6884b3741d397ed7c6fc15d19"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 
15:46:04.640418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p67qs" event={"ID":"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b","Type":"ContainerStarted","Data":"100540030afd95c37d49ec2b595b7be762f7d926f3f01ca5f118870dc6d7d9ef"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.640478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p67qs" event={"ID":"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b","Type":"ContainerStarted","Data":"762fa444b1a41d3d7fca4fa6008cf3eb127051371275e2c6b18167e0f0807843"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.657812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2bxp" event={"ID":"3cd3f2b8-c365-4845-8508-2403d3b1f03e","Type":"ContainerStarted","Data":"8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.663386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6bhfp" event={"ID":"b12aa4f9-2fe2-4bfd-b764-3755131eb10a","Type":"ContainerStarted","Data":"10e77be285db4fee662b084f3144e6e738f12f672d39f74b5a0d231562e848fe"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.681709 4773 generic.go:334] "Generic (PLEG): container finished" podID="217731b1-7352-47f8-a734-d0fa2ac45ab1" containerID="ae5d63a87d6411b0c6cb97e635a6533ab765358290709d577945242d5a74dc1d" exitCode=0 Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.683184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp" event={"ID":"217731b1-7352-47f8-a734-d0fa2ac45ab1","Type":"ContainerDied","Data":"ae5d63a87d6411b0c6cb97e635a6533ab765358290709d577945242d5a74dc1d"} Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.684649 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-4qg94" podStartSLOduration=3.684629132 
podStartE2EDuration="3.684629132s" podCreationTimestamp="2026-01-21 15:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:04.594532815 +0000 UTC m=+1329.519022437" watchObservedRunningTime="2026-01-21 15:46:04.684629132 +0000 UTC m=+1329.609118774" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.697969 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.743366 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-l79d2" podStartSLOduration=3.743347791 podStartE2EDuration="3.743347791s" podCreationTimestamp="2026-01-21 15:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:04.696319603 +0000 UTC m=+1329.620809225" watchObservedRunningTime="2026-01-21 15:46:04.743347791 +0000 UTC m=+1329.667837413" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.824003 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p67qs" podStartSLOduration=3.823976579 podStartE2EDuration="3.823976579s" podCreationTimestamp="2026-01-21 15:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:04.71815312 +0000 UTC m=+1329.642642752" watchObservedRunningTime="2026-01-21 15:46:04.823976579 +0000 UTC m=+1329.748466201" Jan 21 15:46:04 crc kubenswrapper[4773]: I0121 15:46:04.986395 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.104590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.104841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt964\" (UniqueName: \"kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.104944 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.105057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.105157 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.105445 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0\") pod \"10181654-0fe4-4966-92d5-821254a0c1dc\" (UID: \"10181654-0fe4-4966-92d5-821254a0c1dc\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.169900 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964" (OuterVolumeSpecName: "kube-api-access-kt964") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "kube-api-access-kt964". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.226955 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt964\" (UniqueName: \"kubernetes.io/projected/10181654-0fe4-4966-92d5-821254a0c1dc-kube-api-access-kt964\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.333233 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.345335 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config" (OuterVolumeSpecName: "config") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.359855 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.440097 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.440421 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.440432 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.519059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.556222 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.600656 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10181654-0fe4-4966-92d5-821254a0c1dc" (UID: "10181654-0fe4-4966-92d5-821254a0c1dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.658263 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10181654-0fe4-4966-92d5-821254a0c1dc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.697237 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.740745 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" event={"ID":"2daf1511-7c23-47cc-900d-07ed88b573c3","Type":"ContainerStarted","Data":"4e8e03a30ace383bb1100d158f9fad186856f66547281aa119f4cac36eff50e8"}
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.740882 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.764579 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.764872 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.764956 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp" event={"ID":"217731b1-7352-47f8-a734-d0fa2ac45ab1","Type":"ContainerDied","Data":"59315133710a3a09d8df9f876225032fb4b3436df7b3b0683ef4260c384f327c"}
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.765002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.765044 4773 scope.go:117] "RemoveContainer" containerID="ae5d63a87d6411b0c6cb97e635a6533ab765358290709d577945242d5a74dc1d"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.765051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.765288 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwds\" (UniqueName: \"kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.765395 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb\") pod \"217731b1-7352-47f8-a734-d0fa2ac45ab1\" (UID: \"217731b1-7352-47f8-a734-d0fa2ac45ab1\") "
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.767739 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-dq9fp"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.792512 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" podStartSLOduration=3.7924909380000003 podStartE2EDuration="3.792490938s" podCreationTimestamp="2026-01-21 15:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:05.776224623 +0000 UTC m=+1330.700714265" watchObservedRunningTime="2026-01-21 15:46:05.792490938 +0000 UTC m=+1330.716980560"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.792939 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds" (OuterVolumeSpecName: "kube-api-access-8kwds") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "kube-api-access-8kwds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.796086 4773 generic.go:334] "Generic (PLEG): container finished" podID="9e4ee4bc-4d4e-4227-b817-3de2e3860581" containerID="3f99f3742eccde965c85b8c63b109f6acf3d18e10c52777ce3abb069cdac81f8" exitCode=0
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.796161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4qg94" event={"ID":"9e4ee4bc-4d4e-4227-b817-3de2e3860581","Type":"ContainerDied","Data":"3f99f3742eccde965c85b8c63b109f6acf3d18e10c52777ce3abb069cdac81f8"}
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.803555 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.807797 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.820289 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.821012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-ddrwp" event={"ID":"10181654-0fe4-4966-92d5-821254a0c1dc","Type":"ContainerDied","Data":"f284c48d5487d8050c692f03d45c19b08ca63e1a0bddc812483a7d011246a3b3"}
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.850593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.860555 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config" (OuterVolumeSpecName: "config") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.864170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "217731b1-7352-47f8-a734-d0fa2ac45ab1" (UID: "217731b1-7352-47f8-a734-d0fa2ac45ab1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.871918 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.871952 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.871964 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.871978 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwds\" (UniqueName: \"kubernetes.io/projected/217731b1-7352-47f8-a734-d0fa2ac45ab1-kube-api-access-8kwds\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.871991 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.872002 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217731b1-7352-47f8-a734-d0fa2ac45ab1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.923098 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.926402 4773 scope.go:117] "RemoveContainer" containerID="c4cc05f13561187d5c5ed89d58971a1a03cae2f3f9bb826c9d25720de85847aa"
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.936158 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-ddrwp"]
Jan 21 15:46:05 crc kubenswrapper[4773]: I0121 15:46:05.986858 4773 scope.go:117] "RemoveContainer" containerID="a043764c953e9f3659ed8d550a61a6122f4e312733c61090b1c544adb810db22"
Jan 21 15:46:06 crc kubenswrapper[4773]: I0121 15:46:06.233627 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"]
Jan 21 15:46:06 crc kubenswrapper[4773]: I0121 15:46:06.246322 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-dq9fp"]
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.400358 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" path="/var/lib/kubelet/pods/10181654-0fe4-4966-92d5-821254a0c1dc/volumes"
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.403149 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217731b1-7352-47f8-a734-d0fa2ac45ab1" path="/var/lib/kubelet/pods/217731b1-7352-47f8-a734-d0fa2ac45ab1/volumes"
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.473019 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.521810 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vprjn\" (UniqueName: \"kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn\") pod \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") "
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.521861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts\") pod \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\" (UID: \"9e4ee4bc-4d4e-4227-b817-3de2e3860581\") "
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.522425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e4ee4bc-4d4e-4227-b817-3de2e3860581" (UID: "9e4ee4bc-4d4e-4227-b817-3de2e3860581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.530249 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn" (OuterVolumeSpecName: "kube-api-access-vprjn") pod "9e4ee4bc-4d4e-4227-b817-3de2e3860581" (UID: "9e4ee4bc-4d4e-4227-b817-3de2e3860581"). InnerVolumeSpecName "kube-api-access-vprjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.628955 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vprjn\" (UniqueName: \"kubernetes.io/projected/9e4ee4bc-4d4e-4227-b817-3de2e3860581-kube-api-access-vprjn\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.629019 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4ee4bc-4d4e-4227-b817-3de2e3860581-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.856507 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4qg94" event={"ID":"9e4ee4bc-4d4e-4227-b817-3de2e3860581","Type":"ContainerDied","Data":"aa96b7b91b8d89d5d062e9c46ee52c50c9f3b75a2a6767ae85962cd442a993a5"}
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.856550 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa96b7b91b8d89d5d062e9c46ee52c50c9f3b75a2a6767ae85962cd442a993a5"
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.856602 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4qg94"
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.910444 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4qg94"]
Jan 21 15:46:07 crc kubenswrapper[4773]: I0121 15:46:07.921981 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4qg94"]
Jan 21 15:46:09 crc kubenswrapper[4773]: I0121 15:46:09.401550 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4ee4bc-4d4e-4227-b817-3de2e3860581" path="/var/lib/kubelet/pods/9e4ee4bc-4d4e-4227-b817-3de2e3860581/volumes"
Jan 21 15:46:09 crc kubenswrapper[4773]: I0121 15:46:09.897568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vcf9" event={"ID":"3ee2313d-678e-487c-a4af-ae303d40bedd","Type":"ContainerStarted","Data":"3714d8c788885f1482f8ecf2d8a36732d8580ef97d5ff098dbfddd4e34f09975"}
Jan 21 15:46:09 crc kubenswrapper[4773]: I0121 15:46:09.900376 4773 generic.go:334] "Generic (PLEG): container finished" podID="f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" containerID="100540030afd95c37d49ec2b595b7be762f7d926f3f01ca5f118870dc6d7d9ef" exitCode=0
Jan 21 15:46:09 crc kubenswrapper[4773]: I0121 15:46:09.900421 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p67qs" event={"ID":"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b","Type":"ContainerDied","Data":"100540030afd95c37d49ec2b595b7be762f7d926f3f01ca5f118870dc6d7d9ef"}
Jan 21 15:46:10 crc kubenswrapper[4773]: I0121 15:46:10.945492 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7vcf9" podStartSLOduration=3.962201448 podStartE2EDuration="41.945471587s" podCreationTimestamp="2026-01-21 15:45:29 +0000 UTC" firstStartedPulling="2026-01-21 15:45:30.856409337 +0000 UTC m=+1295.780898959" lastFinishedPulling="2026-01-21 15:46:08.839679486 +0000 UTC m=+1333.764169098" observedRunningTime="2026-01-21 15:46:10.936202242 +0000 UTC m=+1335.860691864" watchObservedRunningTime="2026-01-21 15:46:10.945471587 +0000 UTC m=+1335.869961209"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.479810 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zvwk6"]
Jan 21 15:46:11 crc kubenswrapper[4773]: E0121 15:46:11.480549 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217731b1-7352-47f8-a734-d0fa2ac45ab1" containerName="init"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480565 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="217731b1-7352-47f8-a734-d0fa2ac45ab1" containerName="init"
Jan 21 15:46:11 crc kubenswrapper[4773]: E0121 15:46:11.480586 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="init"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480594 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="init"
Jan 21 15:46:11 crc kubenswrapper[4773]: E0121 15:46:11.480624 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="dnsmasq-dns"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480634 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="dnsmasq-dns"
Jan 21 15:46:11 crc kubenswrapper[4773]: E0121 15:46:11.480649 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4ee4bc-4d4e-4227-b817-3de2e3860581" containerName="mariadb-account-create-update"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480656 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4ee4bc-4d4e-4227-b817-3de2e3860581" containerName="mariadb-account-create-update"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480886 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="217731b1-7352-47f8-a734-d0fa2ac45ab1" containerName="init"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480919 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4ee4bc-4d4e-4227-b817-3de2e3860581" containerName="mariadb-account-create-update"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.480932 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="10181654-0fe4-4966-92d5-821254a0c1dc" containerName="dnsmasq-dns"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.481720 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.484135 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.506869 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zvwk6"]
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.552995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7bfq\" (UniqueName: \"kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.554801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.657655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bfq\" (UniqueName: \"kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.657854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.658789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.704913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7bfq\" (UniqueName: \"kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq\") pod \"root-account-create-update-zvwk6\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:11 crc kubenswrapper[4773]: I0121 15:46:11.802180 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zvwk6"
Jan 21 15:46:12 crc kubenswrapper[4773]: I0121 15:46:12.650924 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99"
Jan 21 15:46:12 crc kubenswrapper[4773]: I0121 15:46:12.723476 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"]
Jan 21 15:46:12 crc kubenswrapper[4773]: I0121 15:46:12.723719 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-vlcdh" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns" containerID="cri-o://a57b464d3678b7bf523449d37eae262c473608887ea71321500fa2c91e35769e" gracePeriod=10
Jan 21 15:46:12 crc kubenswrapper[4773]: I0121 15:46:12.789530 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-vlcdh" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused"
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.604404 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638619 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638758 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwn9v\" (UniqueName: \"kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638848 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638898 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.638926 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data\") pod \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\" (UID: \"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b\") "
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.644600 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.644999 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts" (OuterVolumeSpecName: "scripts") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.645341 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v" (OuterVolumeSpecName: "kube-api-access-vwn9v") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "kube-api-access-vwn9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.651127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.678887 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data" (OuterVolumeSpecName: "config-data") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.683443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" (UID: "f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741392 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741437 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741448 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741458 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741469 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.741479 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwn9v\" (UniqueName: \"kubernetes.io/projected/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b-kube-api-access-vwn9v\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.998129 4773 generic.go:334] "Generic (PLEG): container finished" podID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerID="a57b464d3678b7bf523449d37eae262c473608887ea71321500fa2c91e35769e" exitCode=0
Jan 21 15:46:15 crc kubenswrapper[4773]: I0121 15:46:15.998180 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vlcdh" event={"ID":"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd","Type":"ContainerDied","Data":"a57b464d3678b7bf523449d37eae262c473608887ea71321500fa2c91e35769e"}
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.000460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p67qs" event={"ID":"f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b","Type":"ContainerDied","Data":"762fa444b1a41d3d7fca4fa6008cf3eb127051371275e2c6b18167e0f0807843"}
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.000511 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="762fa444b1a41d3d7fca4fa6008cf3eb127051371275e2c6b18167e0f0807843"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.000575 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p67qs"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.685490 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p67qs"]
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.699329 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p67qs"]
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.786423 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6x5cx"]
Jan 21 15:46:16 crc kubenswrapper[4773]: E0121 15:46:16.786825 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" containerName="keystone-bootstrap"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.786836 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" containerName="keystone-bootstrap"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.787017 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" containerName="keystone-bootstrap"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.787638 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x5cx"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.790518 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.790582 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.790727 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.790887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sf6hv"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.804932 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6x5cx"]
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.881451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.881640 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx"
Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.881876 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrqgw\" (UniqueName: \"kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw\") pod \"keystone-bootstrap-6x5cx\" (UID: 
\"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.881920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.881983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.882116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrqgw\" (UniqueName: \"kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " 
pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.983806 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.988912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.989663 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.990114 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.998314 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:16 crc kubenswrapper[4773]: I0121 15:46:16.998918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:17 crc kubenswrapper[4773]: I0121 15:46:17.003951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrqgw\" (UniqueName: \"kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw\") pod \"keystone-bootstrap-6x5cx\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") " pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:17 crc kubenswrapper[4773]: I0121 15:46:17.119329 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:46:17 crc kubenswrapper[4773]: I0121 15:46:17.396684 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b" path="/var/lib/kubelet/pods/f78fc5a1-0a68-4a15-8fb5-0e1800a4aa7b/volumes" Jan 21 15:46:17 crc kubenswrapper[4773]: I0121 15:46:17.784974 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-vlcdh" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Jan 21 15:46:18 crc kubenswrapper[4773]: E0121 15:46:18.793175 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 21 15:46:18 crc kubenswrapper[4773]: E0121 15:46:18.793338 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxjsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-6bhfp_openstack(b12aa4f9-2fe2-4bfd-b764-3755131eb10a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:46:18 crc kubenswrapper[4773]: E0121 15:46:18.794582 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-6bhfp" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" Jan 21 15:46:19 crc kubenswrapper[4773]: E0121 15:46:19.048454 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-6bhfp" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.618490 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.726477 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb\") pod \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.726597 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config\") pod \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.726657 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzzr9\" (UniqueName: \"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9\") pod \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.726736 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc\") pod \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.726884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb\") pod \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\" (UID: \"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd\") " Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.741093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9" (OuterVolumeSpecName: "kube-api-access-bzzr9") pod "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" (UID: "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd"). InnerVolumeSpecName "kube-api-access-bzzr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.783682 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" (UID: "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.795257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" (UID: "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.796936 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config" (OuterVolumeSpecName: "config") pod "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" (UID: "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.807036 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" (UID: "e48bc43f-55bc-4b6a-bc8a-dac53e6549cd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.829890 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.829934 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.829946 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.829959 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzzr9\" (UniqueName: \"kubernetes.io/projected/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-kube-api-access-bzzr9\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:24 crc kubenswrapper[4773]: I0121 15:46:24.829974 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.091088 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vlcdh" event={"ID":"e48bc43f-55bc-4b6a-bc8a-dac53e6549cd","Type":"ContainerDied","Data":"f0fb75b378ce7bf7a9f4443448c5fb834dbf432b1fdb495aa024273dffee4b21"} Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.091144 4773 scope.go:117] "RemoveContainer" containerID="a57b464d3678b7bf523449d37eae262c473608887ea71321500fa2c91e35769e" Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.091156 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vlcdh" Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.129892 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"] Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.139577 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vlcdh"] Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.206485 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.206542 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:46:25 crc kubenswrapper[4773]: I0121 15:46:25.406544 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" path="/var/lib/kubelet/pods/e48bc43f-55bc-4b6a-bc8a-dac53e6549cd/volumes" Jan 21 15:46:27 crc kubenswrapper[4773]: I0121 15:46:27.785116 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-vlcdh" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Jan 21 15:46:37 crc kubenswrapper[4773]: E0121 15:46:37.757651 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 21 15:46:37 crc 
kubenswrapper[4773]: E0121 15:46:37.758253 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8bhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-n2bxp_openstack(3cd3f2b8-c365-4845-8508-2403d3b1f03e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:46:37 crc kubenswrapper[4773]: E0121 
15:46:37.759440 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-n2bxp" podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" Jan 21 15:46:38 crc kubenswrapper[4773]: E0121 15:46:38.225655 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-n2bxp" podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" Jan 21 15:46:41 crc kubenswrapper[4773]: E0121 15:46:41.426560 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 21 15:46:41 crc kubenswrapper[4773]: E0121 15:46:41.427152 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mjff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-znjg2_openstack(5c0beb28-481c-4507-94c2-d644e4faf5ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:46:41 crc kubenswrapper[4773]: E0121 15:46:41.428327 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-znjg2" podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" Jan 21 15:46:42 crc kubenswrapper[4773]: E0121 15:46:42.256251 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-znjg2" podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" Jan 21 15:46:45 crc kubenswrapper[4773]: I0121 15:46:45.183237 4773 scope.go:117] "RemoveContainer" containerID="3314ab0a44d2eda36e92c163faa40a03e3341d9da1050bccfbe5b21921f56dc9" Jan 21 15:46:45 crc kubenswrapper[4773]: I0121 15:46:45.642593 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6x5cx"] Jan 21 15:46:45 crc kubenswrapper[4773]: I0121 15:46:45.737588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zvwk6"] Jan 21 15:46:46 crc kubenswrapper[4773]: I0121 15:46:46.303683 4773 generic.go:334] "Generic (PLEG): container finished" podID="3ee2313d-678e-487c-a4af-ae303d40bedd" containerID="3714d8c788885f1482f8ecf2d8a36732d8580ef97d5ff098dbfddd4e34f09975" exitCode=0 Jan 21 15:46:46 crc kubenswrapper[4773]: I0121 15:46:46.303748 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-sync-7vcf9" event={"ID":"3ee2313d-678e-487c-a4af-ae303d40bedd","Type":"ContainerDied","Data":"3714d8c788885f1482f8ecf2d8a36732d8580ef97d5ff098dbfddd4e34f09975"} Jan 21 15:46:48 crc kubenswrapper[4773]: W0121 15:46:48.541564 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eae6f1f_bc67_4acc_836b_68396e478669.slice/crio-051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5 WatchSource:0}: Error finding container 051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5: Status 404 returned error can't find the container with id 051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5 Jan 21 15:46:48 crc kubenswrapper[4773]: E0121 15:46:48.577746 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 21 15:46:48 crc kubenswrapper[4773]: E0121 15:46:48.577820 4773 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 21 15:46:48 crc kubenswrapper[4773]: E0121 15:46:48.577964 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hrlkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-77kxr_openstack(de6a84b1-1846-4dd0-be7f-47a8872227ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 15:46:48 crc kubenswrapper[4773]: E0121 15:46:48.579135 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-77kxr" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff"
Jan 21 15:46:48 crc kubenswrapper[4773]: I0121 15:46:48.864584 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vcf9"
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.056176 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data\") pod \"3ee2313d-678e-487c-a4af-ae303d40bedd\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") "
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.056485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle\") pod \"3ee2313d-678e-487c-a4af-ae303d40bedd\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") "
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.056590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data\") pod \"3ee2313d-678e-487c-a4af-ae303d40bedd\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") "
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.056621 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2ql8\" (UniqueName: \"kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8\") pod \"3ee2313d-678e-487c-a4af-ae303d40bedd\" (UID: \"3ee2313d-678e-487c-a4af-ae303d40bedd\") "
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.060448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3ee2313d-678e-487c-a4af-ae303d40bedd" (UID: "3ee2313d-678e-487c-a4af-ae303d40bedd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.062072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8" (OuterVolumeSpecName: "kube-api-access-v2ql8") pod "3ee2313d-678e-487c-a4af-ae303d40bedd" (UID: "3ee2313d-678e-487c-a4af-ae303d40bedd"). InnerVolumeSpecName "kube-api-access-v2ql8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.080578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ee2313d-678e-487c-a4af-ae303d40bedd" (UID: "3ee2313d-678e-487c-a4af-ae303d40bedd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.104869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data" (OuterVolumeSpecName: "config-data") pod "3ee2313d-678e-487c-a4af-ae303d40bedd" (UID: "3ee2313d-678e-487c-a4af-ae303d40bedd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.159248 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.159295 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2ql8\" (UniqueName: \"kubernetes.io/projected/3ee2313d-678e-487c-a4af-ae303d40bedd-kube-api-access-v2ql8\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.159312 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.159323 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee2313d-678e-487c-a4af-ae303d40bedd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.331664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x5cx" event={"ID":"3eae6f1f-bc67-4acc-836b-68396e478669","Type":"ContainerStarted","Data":"7d86cea0536d0c13baaabd5485b8eaf52a9043d6004981b6d7a42e2aeaa3129d"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.331742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x5cx" event={"ID":"3eae6f1f-bc67-4acc-836b-68396e478669","Type":"ContainerStarted","Data":"051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.337799 4773 generic.go:334] "Generic (PLEG): container finished" podID="f88d424e-1f05-4033-95a2-fff161616fa1" containerID="c88e906fd599d1127b95fa47a2a6334a9ee19bdcf5e06009f648515a6e4d00a3" exitCode=0
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.337856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvwk6" event={"ID":"f88d424e-1f05-4033-95a2-fff161616fa1","Type":"ContainerDied","Data":"c88e906fd599d1127b95fa47a2a6334a9ee19bdcf5e06009f648515a6e4d00a3"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.337893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvwk6" event={"ID":"f88d424e-1f05-4033-95a2-fff161616fa1","Type":"ContainerStarted","Data":"c2cba6e78e1978b5a01c3a3dc5fc197b21d2f1b2cd9e81cb8231ceb77af4f7d5"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.339714 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerStarted","Data":"ae21538c3d66deff4e80c8fcb1b7b626228d77fa95d07860087df285515858ce"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.342302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6bhfp" event={"ID":"b12aa4f9-2fe2-4bfd-b764-3755131eb10a","Type":"ContainerStarted","Data":"dd707d39e543364bfdf48d96c129e34d8eda0f48a81578d43c56fa9b9127e4e7"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.346209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vcf9" event={"ID":"3ee2313d-678e-487c-a4af-ae303d40bedd","Type":"ContainerDied","Data":"806097a4d438ac642a784a61d396a7b7af89579fbbbc6fac52df15ebfce3ae2b"}
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.346260 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806097a4d438ac642a784a61d396a7b7af89579fbbbc6fac52df15ebfce3ae2b"
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.346222 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vcf9"
Jan 21 15:46:49 crc kubenswrapper[4773]: E0121 15:46:49.347539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-77kxr" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff"
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.361157 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6x5cx" podStartSLOduration=33.361092628 podStartE2EDuration="33.361092628s" podCreationTimestamp="2026-01-21 15:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:49.351141671 +0000 UTC m=+1374.275631303" watchObservedRunningTime="2026-01-21 15:46:49.361092628 +0000 UTC m=+1374.285582260"
Jan 21 15:46:49 crc kubenswrapper[4773]: I0121 15:46:49.391219 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6bhfp" podStartSLOduration=3.681242948 podStartE2EDuration="48.391200247s" podCreationTimestamp="2026-01-21 15:46:01 +0000 UTC" firstStartedPulling="2026-01-21 15:46:03.87700821 +0000 UTC m=+1328.801497842" lastFinishedPulling="2026-01-21 15:46:48.586965519 +0000 UTC m=+1373.511455141" observedRunningTime="2026-01-21 15:46:49.380372626 +0000 UTC m=+1374.304862238" watchObservedRunningTime="2026-01-21 15:46:49.391200247 +0000 UTC m=+1374.315689869"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.271937 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"]
Jan 21 15:46:50 crc kubenswrapper[4773]: E0121 15:46:50.274539 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" containerName="glance-db-sync"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.274639 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" containerName="glance-db-sync"
Jan 21 15:46:50 crc kubenswrapper[4773]: E0121 15:46:50.274714 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.274765 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns"
Jan 21 15:46:50 crc kubenswrapper[4773]: E0121 15:46:50.274825 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="init"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.277042 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="init"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.277463 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48bc43f-55bc-4b6a-bc8a-dac53e6549cd" containerName="dnsmasq-dns"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.277561 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" containerName="glance-db-sync"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.282386 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.316383 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"]
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.389864 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.389986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.390044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.390085 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lblt\" (UniqueName: \"kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.402247 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.402362 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.503833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lblt\" (UniqueName: \"kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.503950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.504018 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.504059 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.504137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.504210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.505174 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.507682 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.508738 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.509275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.509986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.535155 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lblt\" (UniqueName: \"kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt\") pod \"dnsmasq-dns-785d8bcb8c-rw9jg\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:50 crc kubenswrapper[4773]: I0121 15:46:50.641129 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.172142 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.174058 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.176501 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-stvz5"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.177178 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.177237 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.189398 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320756 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.320883 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zkvt\" (UniqueName: \"kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.321002 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.408338 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.411125 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.415085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.415754 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422401 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422533 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422554 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422604 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422626 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.422644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zkvt\" (UniqueName: \"kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.423933 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.424643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.435865 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.436295 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e6ddc15da45c425afe1609cfb31bc40a4ae0f7e1b60627fb6c6d646b1880744e/globalmount\"" pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.443172 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.450514 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zkvt\" (UniqueName: \"kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.451160 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.453620 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.518784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524245 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4b7c\" (UniqueName: \"kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.524544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.625889 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.625939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.625985 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4b7c\" (UniqueName: \"kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.626014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.626082 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.626112 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.626161 4773 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.627139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.627479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.630184 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.630468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.630953 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.632402 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.632430 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/149dd6dfda276adff7f1f12e0c1d439e14e49afb630f1d98c3833562ebaefedd/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.644065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4b7c\" (UniqueName: \"kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.672765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.801944 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.803141 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zvwk6" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.853336 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.933836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7bfq\" (UniqueName: \"kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq\") pod \"f88d424e-1f05-4033-95a2-fff161616fa1\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.934186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts\") pod \"f88d424e-1f05-4033-95a2-fff161616fa1\" (UID: \"f88d424e-1f05-4033-95a2-fff161616fa1\") " Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.935132 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f88d424e-1f05-4033-95a2-fff161616fa1" (UID: "f88d424e-1f05-4033-95a2-fff161616fa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:46:51 crc kubenswrapper[4773]: I0121 15:46:51.942027 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq" (OuterVolumeSpecName: "kube-api-access-c7bfq") pod "f88d424e-1f05-4033-95a2-fff161616fa1" (UID: "f88d424e-1f05-4033-95a2-fff161616fa1"). 
InnerVolumeSpecName "kube-api-access-c7bfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.036585 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7bfq\" (UniqueName: \"kubernetes.io/projected/f88d424e-1f05-4033-95a2-fff161616fa1-kube-api-access-c7bfq\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.036625 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88d424e-1f05-4033-95a2-fff161616fa1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.082033 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"] Jan 21 15:46:52 crc kubenswrapper[4773]: W0121 15:46:52.092936 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819f84c9_b51d_4f3c_81c2_c39af2110563.slice/crio-e44941720147ed4b6ea07e2b4646fb1874f44b15c568973f6eabaac68397c921 WatchSource:0}: Error finding container e44941720147ed4b6ea07e2b4646fb1874f44b15c568973f6eabaac68397c921: Status 404 returned error can't find the container with id e44941720147ed4b6ea07e2b4646fb1874f44b15c568973f6eabaac68397c921 Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.382405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvwk6" event={"ID":"f88d424e-1f05-4033-95a2-fff161616fa1","Type":"ContainerDied","Data":"c2cba6e78e1978b5a01c3a3dc5fc197b21d2f1b2cd9e81cb8231ceb77af4f7d5"} Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.382804 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cba6e78e1978b5a01c3a3dc5fc197b21d2f1b2cd9e81cb8231ceb77af4f7d5" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.382418 4773 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/root-account-create-update-zvwk6" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.385522 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerStarted","Data":"1981941504f849effb65da2d5b25a3c47a6a97bbb7d5abeba9d270e23e5816ab"} Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.393876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2bxp" event={"ID":"3cd3f2b8-c365-4845-8508-2403d3b1f03e","Type":"ContainerStarted","Data":"c67b185dffa20fe1a4a76944ac60a9359ff42fdac69e7b491c802b48b2b8a22d"} Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.406238 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" event={"ID":"819f84c9-b51d-4f3c-81c2-c39af2110563","Type":"ContainerStarted","Data":"e44941720147ed4b6ea07e2b4646fb1874f44b15c568973f6eabaac68397c921"} Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.422433 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-n2bxp" podStartSLOduration=4.644711821 podStartE2EDuration="50.422412987s" podCreationTimestamp="2026-01-21 15:46:02 +0000 UTC" firstStartedPulling="2026-01-21 15:46:04.013232372 +0000 UTC m=+1328.937721994" lastFinishedPulling="2026-01-21 15:46:49.790933538 +0000 UTC m=+1374.715423160" observedRunningTime="2026-01-21 15:46:52.413519498 +0000 UTC m=+1377.338009130" watchObservedRunningTime="2026-01-21 15:46:52.422412987 +0000 UTC m=+1377.346902609" Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.514418 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:46:52 crc kubenswrapper[4773]: I0121 15:46:52.615400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:46:52 crc 
kubenswrapper[4773]: I0121 15:46:52.994835 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zvwk6"] Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.002344 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zvwk6"] Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.399251 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88d424e-1f05-4033-95a2-fff161616fa1" path="/var/lib/kubelet/pods/f88d424e-1f05-4033-95a2-fff161616fa1/volumes" Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.422050 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerStarted","Data":"4cdd14843b40361b31fe5948f346896cc99c0ca8b4d9af4ef8e411d07709e828"} Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.430066 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerStarted","Data":"361b0f10201416e8f90a0b5f6ddc971af83b0b4eac26e4f0e3cdcdf505d334b1"} Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.433970 4773 generic.go:334] "Generic (PLEG): container finished" podID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerID="8a441b89268ead55e85d6f0400d6038b9a06a10e3aaa5320ff5beb438f5ae18f" exitCode=0 Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.434018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" event={"ID":"819f84c9-b51d-4f3c-81c2-c39af2110563","Type":"ContainerDied","Data":"8a441b89268ead55e85d6f0400d6038b9a06a10e3aaa5320ff5beb438f5ae18f"} Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.461358 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:46:53 crc kubenswrapper[4773]: I0121 15:46:53.602440 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:46:54 crc kubenswrapper[4773]: I0121 15:46:54.454833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerStarted","Data":"3838bb8011c283bbb4fc3c55d05f49960cdce119c8bdb21d5c491bb8e3927ba2"} Jan 21 15:46:54 crc kubenswrapper[4773]: I0121 15:46:54.462753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerStarted","Data":"f69df6d0719fa4a66cc816f3d5eddd9062642cfcc58665312fd7ed662bb9db07"} Jan 21 15:46:54 crc kubenswrapper[4773]: I0121 15:46:54.469039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" event={"ID":"819f84c9-b51d-4f3c-81c2-c39af2110563","Type":"ContainerStarted","Data":"8d344507933e713208a41949c0305bb539dcf31453ebaf04d167b14b07955173"} Jan 21 15:46:54 crc kubenswrapper[4773]: I0121 15:46:54.470038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" Jan 21 15:46:54 crc kubenswrapper[4773]: I0121 15:46:54.502284 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" podStartSLOduration=4.502266051 podStartE2EDuration="4.502266051s" podCreationTimestamp="2026-01-21 15:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:54.492129108 +0000 UTC m=+1379.416618730" watchObservedRunningTime="2026-01-21 15:46:54.502266051 +0000 UTC m=+1379.426755673" Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.206497 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.206572 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.495556 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerStarted","Data":"13c5a6b56405fe73ba002bbbd2f97415535a72bb99eeb4ff7abe02d50d331d0c"} Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.496189 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-log" containerID="cri-o://f69df6d0719fa4a66cc816f3d5eddd9062642cfcc58665312fd7ed662bb9db07" gracePeriod=30 Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.496328 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-httpd" containerID="cri-o://13c5a6b56405fe73ba002bbbd2f97415535a72bb99eeb4ff7abe02d50d331d0c" gracePeriod=30 Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.501971 4773 generic.go:334] "Generic (PLEG): container finished" podID="3eae6f1f-bc67-4acc-836b-68396e478669" containerID="7d86cea0536d0c13baaabd5485b8eaf52a9043d6004981b6d7a42e2aeaa3129d" exitCode=0 Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.502072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x5cx" 
event={"ID":"3eae6f1f-bc67-4acc-836b-68396e478669","Type":"ContainerDied","Data":"7d86cea0536d0c13baaabd5485b8eaf52a9043d6004981b6d7a42e2aeaa3129d"} Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.507024 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerStarted","Data":"ad195c79dc3ec9adc45da6c2f92c13ae49494b8ad1680099f0f85e3acbdc06fa"} Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.507030 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-log" containerID="cri-o://3838bb8011c283bbb4fc3c55d05f49960cdce119c8bdb21d5c491bb8e3927ba2" gracePeriod=30 Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.507211 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-httpd" containerID="cri-o://ad195c79dc3ec9adc45da6c2f92c13ae49494b8ad1680099f0f85e3acbdc06fa" gracePeriod=30 Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.527902 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.527884813 podStartE2EDuration="5.527884813s" podCreationTimestamp="2026-01-21 15:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:55.525856909 +0000 UTC m=+1380.450346531" watchObservedRunningTime="2026-01-21 15:46:55.527884813 +0000 UTC m=+1380.452374435" Jan 21 15:46:55 crc kubenswrapper[4773]: I0121 15:46:55.592357 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.592334106 
podStartE2EDuration="5.592334106s" podCreationTimestamp="2026-01-21 15:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:46:55.576778968 +0000 UTC m=+1380.501268600" watchObservedRunningTime="2026-01-21 15:46:55.592334106 +0000 UTC m=+1380.516823738" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.529556 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wnbml"] Jan 21 15:46:56 crc kubenswrapper[4773]: E0121 15:46:56.533552 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88d424e-1f05-4033-95a2-fff161616fa1" containerName="mariadb-account-create-update" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.533583 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88d424e-1f05-4033-95a2-fff161616fa1" containerName="mariadb-account-create-update" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.533859 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f88d424e-1f05-4033-95a2-fff161616fa1" containerName="mariadb-account-create-update" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.534763 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.537011 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.556849 4773 generic.go:334] "Generic (PLEG): container finished" podID="2f358a9d-1083-4576-96ce-f28761167281" containerID="ad195c79dc3ec9adc45da6c2f92c13ae49494b8ad1680099f0f85e3acbdc06fa" exitCode=0 Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.556880 4773 generic.go:334] "Generic (PLEG): container finished" podID="2f358a9d-1083-4576-96ce-f28761167281" containerID="3838bb8011c283bbb4fc3c55d05f49960cdce119c8bdb21d5c491bb8e3927ba2" exitCode=143 Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.556935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerDied","Data":"ad195c79dc3ec9adc45da6c2f92c13ae49494b8ad1680099f0f85e3acbdc06fa"} Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.556971 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerDied","Data":"3838bb8011c283bbb4fc3c55d05f49960cdce119c8bdb21d5c491bb8e3927ba2"} Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.571075 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wnbml"] Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.575633 4773 generic.go:334] "Generic (PLEG): container finished" podID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerID="13c5a6b56405fe73ba002bbbd2f97415535a72bb99eeb4ff7abe02d50d331d0c" exitCode=0 Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.575669 4773 generic.go:334] "Generic (PLEG): container finished" podID="fc612e0c-df99-4189-96eb-ab85d3c20688" 
containerID="f69df6d0719fa4a66cc816f3d5eddd9062642cfcc58665312fd7ed662bb9db07" exitCode=143 Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.575880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerDied","Data":"13c5a6b56405fe73ba002bbbd2f97415535a72bb99eeb4ff7abe02d50d331d0c"} Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.575914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerDied","Data":"f69df6d0719fa4a66cc816f3d5eddd9062642cfcc58665312fd7ed662bb9db07"} Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.670996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwr6z\" (UniqueName: \"kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.671172 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.772995 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc 
kubenswrapper[4773]: I0121 15:46:56.773427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwr6z\" (UniqueName: \"kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.773970 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.801436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwr6z\" (UniqueName: \"kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z\") pod \"root-account-create-update-wnbml\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:56 crc kubenswrapper[4773]: I0121 15:46:56.865714 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wnbml" Jan 21 15:46:57 crc kubenswrapper[4773]: I0121 15:46:57.598758 4773 generic.go:334] "Generic (PLEG): container finished" podID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" containerID="dd707d39e543364bfdf48d96c129e34d8eda0f48a81578d43c56fa9b9127e4e7" exitCode=0 Jan 21 15:46:57 crc kubenswrapper[4773]: I0121 15:46:57.598923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6bhfp" event={"ID":"b12aa4f9-2fe2-4bfd-b764-3755131eb10a","Type":"ContainerDied","Data":"dd707d39e543364bfdf48d96c129e34d8eda0f48a81578d43c56fa9b9127e4e7"} Jan 21 15:46:57 crc kubenswrapper[4773]: I0121 15:46:57.982957 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.098443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.098529 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4b7c\" (UniqueName: \"kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.098579 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.099303 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.100096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs" (OuterVolumeSpecName: "logs") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.100215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") " Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.100230 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.100356 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") "
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.100509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts\") pod \"2f358a9d-1083-4576-96ce-f28761167281\" (UID: \"2f358a9d-1083-4576-96ce-f28761167281\") "
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.102482 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-logs\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.102501 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f358a9d-1083-4576-96ce-f28761167281-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.114275 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts" (OuterVolumeSpecName: "scripts") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.135447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.135892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c" (OuterVolumeSpecName: "kube-api-access-j4b7c") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "kube-api-access-j4b7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.137571 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151" (OuterVolumeSpecName: "glance") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "pvc-5fed4dce-f776-48ab-b524-c032df929151". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.205118 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") on node \"crc\" "
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.205176 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.205191 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4b7c\" (UniqueName: \"kubernetes.io/projected/2f358a9d-1083-4576-96ce-f28761167281-kube-api-access-j4b7c\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.205218 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.221028 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data" (OuterVolumeSpecName: "config-data") pod "2f358a9d-1083-4576-96ce-f28761167281" (UID: "2f358a9d-1083-4576-96ce-f28761167281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.236685 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.236851 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5fed4dce-f776-48ab-b524-c032df929151" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151") on node "crc"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.307728 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.307770 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f358a9d-1083-4576-96ce-f28761167281-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.610839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2f358a9d-1083-4576-96ce-f28761167281","Type":"ContainerDied","Data":"4cdd14843b40361b31fe5948f346896cc99c0ca8b4d9af4ef8e411d07709e828"}
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.611945 4773 scope.go:117] "RemoveContainer" containerID="ad195c79dc3ec9adc45da6c2f92c13ae49494b8ad1680099f0f85e3acbdc06fa"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.612269 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.671459 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.682288 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.697862 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:58 crc kubenswrapper[4773]: E0121 15:46:58.698457 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-log"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.698482 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-log"
Jan 21 15:46:58 crc kubenswrapper[4773]: E0121 15:46:58.698502 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-httpd"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.698511 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-httpd"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.698815 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-httpd"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.698845 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f358a9d-1083-4576-96ce-f28761167281" containerName="glance-log"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.699955 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.705553 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.705798 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.725035 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816579 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816674 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx8s5\" (UniqueName: \"kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816712 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.816747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.922547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.922907 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx8s5\" (UniqueName: \"kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.922930 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.922946 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.922960 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.923007 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.923048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.923100 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.938532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.954799 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.954858 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/149dd6dfda276adff7f1f12e0c1d439e14e49afb630f1d98c3833562ebaefedd/globalmount\"" pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.969956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.970680 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.970747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.972004 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.973406 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:58 crc kubenswrapper[4773]: I0121 15:46:58.973813 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx8s5\" (UniqueName: \"kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:59 crc kubenswrapper[4773]: I0121 15:46:59.039181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:59 crc kubenswrapper[4773]: I0121 15:46:59.323645 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 21 15:46:59 crc kubenswrapper[4773]: I0121 15:46:59.398479 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f358a9d-1083-4576-96ce-f28761167281" path="/var/lib/kubelet/pods/2f358a9d-1083-4576-96ce-f28761167281/volumes"
Jan 21 15:47:00 crc kubenswrapper[4773]: I0121 15:47:00.643353 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg"
Jan 21 15:47:00 crc kubenswrapper[4773]: I0121 15:47:00.732262 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"]
Jan 21 15:47:00 crc kubenswrapper[4773]: I0121 15:47:00.732556 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="dnsmasq-dns" containerID="cri-o://4e8e03a30ace383bb1100d158f9fad186856f66547281aa119f4cac36eff50e8" gracePeriod=10
Jan 21 15:47:01 crc kubenswrapper[4773]: I0121 15:47:01.641528 4773 generic.go:334] "Generic (PLEG): container finished" podID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerID="4e8e03a30ace383bb1100d158f9fad186856f66547281aa119f4cac36eff50e8" exitCode=0
Jan 21 15:47:01 crc kubenswrapper[4773]: I0121 15:47:01.641604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" event={"ID":"2daf1511-7c23-47cc-900d-07ed88b573c3","Type":"ContainerDied","Data":"4e8e03a30ace383bb1100d158f9fad186856f66547281aa119f4cac36eff50e8"}
Jan 21 15:47:01 crc kubenswrapper[4773]: I0121 15:47:01.643617 4773 generic.go:334] "Generic (PLEG): container finished" podID="f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" containerID="5d733e8c68f4ba219ff27620276d938a782447f1b4e38ce0f63ddc39f8b58720" exitCode=0
Jan 21 15:47:01 crc kubenswrapper[4773]: I0121 15:47:01.643648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l79d2" event={"ID":"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd","Type":"ContainerDied","Data":"5d733e8c68f4ba219ff27620276d938a782447f1b4e38ce0f63ddc39f8b58720"}
Jan 21 15:47:02 crc kubenswrapper[4773]: I0121 15:47:02.649688 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: connect: connection refused"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.687359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l79d2" event={"ID":"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd","Type":"ContainerDied","Data":"68fe6b76b8445a66ffdce0e01adc48929d8fa5180ccac9fa976e47ffc76b2336"}
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.687931 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68fe6b76b8445a66ffdce0e01adc48929d8fa5180ccac9fa976e47ffc76b2336"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.689587 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6bhfp" event={"ID":"b12aa4f9-2fe2-4bfd-b764-3755131eb10a","Type":"ContainerDied","Data":"10e77be285db4fee662b084f3144e6e738f12f672d39f74b5a0d231562e848fe"}
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.689630 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e77be285db4fee662b084f3144e6e738f12f672d39f74b5a0d231562e848fe"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.691326 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x5cx" event={"ID":"3eae6f1f-bc67-4acc-836b-68396e478669","Type":"ContainerDied","Data":"051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5"}
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.691356 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051ad8ce8bdb8d5f5aae1581a17d82a9fa10e67ef4b7739d629cda6729d9beb5"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.693515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc612e0c-df99-4189-96eb-ab85d3c20688","Type":"ContainerDied","Data":"361b0f10201416e8f90a0b5f6ddc971af83b0b4eac26e4f0e3cdcdf505d334b1"}
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.693550 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361b0f10201416e8f90a0b5f6ddc971af83b0b4eac26e4f0e3cdcdf505d334b1"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.713926 4773 scope.go:117] "RemoveContainer" containerID="3838bb8011c283bbb4fc3c55d05f49960cdce119c8bdb21d5c491bb8e3927ba2"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.787992 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-l79d2"
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.919272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t484m\" (UniqueName: \"kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m\") pod \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") "
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.919723 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle\") pod \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") "
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.919787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config\") pod \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\" (UID: \"f3a44a95-1489-4fc8-8cbc-8f82568dcdfd\") "
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.980799 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m" (OuterVolumeSpecName: "kube-api-access-t484m") pod "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" (UID: "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd"). InnerVolumeSpecName "kube-api-access-t484m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:06 crc kubenswrapper[4773]: I0121 15:47:06.997613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" (UID: "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.026913 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t484m\" (UniqueName: \"kubernetes.io/projected/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-kube-api-access-t484m\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.026948 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.042078 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.043141 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x5cx"
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.044997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config" (OuterVolumeSpecName: "config") pod "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" (UID: "f3a44a95-1489-4fc8-8cbc-8f82568dcdfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.077019 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6bhfp"
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.128895 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.226200 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99"
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.229874 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.229919 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zkvt\" (UniqueName: \"kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.229986 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230084 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230281 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230380 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle\") pod \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230441 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230524 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230544 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230573 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230638 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrqgw\" (UniqueName: \"kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw\") pod \"3eae6f1f-bc67-4acc-836b-68396e478669\" (UID: \"3eae6f1f-bc67-4acc-836b-68396e478669\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230765 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs\") pod \"fc612e0c-df99-4189-96eb-ab85d3c20688\" (UID: \"fc612e0c-df99-4189-96eb-ab85d3c20688\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230817 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxjsg\" (UniqueName: \"kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg\") pod \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230849 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data\") pod \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts\") pod \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.230918 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs\") pod \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\" (UID: \"b12aa4f9-2fe2-4bfd-b764-3755131eb10a\") "
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.231441 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.232828 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs" (OuterVolumeSpecName: "logs") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.237610 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.241915 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt" (OuterVolumeSpecName: "kube-api-access-4zkvt") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "kube-api-access-4zkvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.243149 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts" (OuterVolumeSpecName: "scripts") pod "b12aa4f9-2fe2-4bfd-b764-3755131eb10a" (UID: "b12aa4f9-2fe2-4bfd-b764-3755131eb10a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.243159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.243897 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs" (OuterVolumeSpecName: "logs") pod "b12aa4f9-2fe2-4bfd-b764-3755131eb10a" (UID: "b12aa4f9-2fe2-4bfd-b764-3755131eb10a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.244760 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts" (OuterVolumeSpecName: "scripts") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.245921 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg" (OuterVolumeSpecName: "kube-api-access-gxjsg") pod "b12aa4f9-2fe2-4bfd-b764-3755131eb10a" (UID: "b12aa4f9-2fe2-4bfd-b764-3755131eb10a"). InnerVolumeSpecName "kube-api-access-gxjsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.246127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts" (OuterVolumeSpecName: "scripts") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.247567 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw" (OuterVolumeSpecName: "kube-api-access-jrqgw") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "kube-api-access-jrqgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.295240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0" (OuterVolumeSpecName: "glance") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.316980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.318597 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data" (OuterVolumeSpecName: "config-data") pod "b12aa4f9-2fe2-4bfd-b764-3755131eb10a" (UID: "b12aa4f9-2fe2-4bfd-b764-3755131eb10a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.330508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data" (OuterVolumeSpecName: "config-data") pod "3eae6f1f-bc67-4acc-836b-68396e478669" (UID: "3eae6f1f-bc67-4acc-836b-68396e478669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.332458 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxhj2\" (UniqueName: \"kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.332766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.332823 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.332850 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.332900 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.333056 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc\") pod \"2daf1511-7c23-47cc-900d-07ed88b573c3\" (UID: \"2daf1511-7c23-47cc-900d-07ed88b573c3\") " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334511 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334538 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334550 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zkvt\" (UniqueName: \"kubernetes.io/projected/fc612e0c-df99-4189-96eb-ab85d3c20688-kube-api-access-4zkvt\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334564 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334577 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334604 4773 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") on node \"crc\" " Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334618 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334632 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334643 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334666 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eae6f1f-bc67-4acc-836b-68396e478669-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334678 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrqgw\" (UniqueName: \"kubernetes.io/projected/3eae6f1f-bc67-4acc-836b-68396e478669-kube-api-access-jrqgw\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334722 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc612e0c-df99-4189-96eb-ab85d3c20688-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334735 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxjsg\" (UniqueName: 
\"kubernetes.io/projected/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-kube-api-access-gxjsg\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.334746 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.338012 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12aa4f9-2fe2-4bfd-b764-3755131eb10a" (UID: "b12aa4f9-2fe2-4bfd-b764-3755131eb10a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.341431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2" (OuterVolumeSpecName: "kube-api-access-jxhj2") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "kube-api-access-jxhj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.368654 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.369049 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0") on node "crc" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.375314 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data" (OuterVolumeSpecName: "config-data") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.376211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc612e0c-df99-4189-96eb-ab85d3c20688" (UID: "fc612e0c-df99-4189-96eb-ab85d3c20688"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.424646 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.431600 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config" (OuterVolumeSpecName: "config") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436242 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxhj2\" (UniqueName: \"kubernetes.io/projected/2daf1511-7c23-47cc-900d-07ed88b573c3-kube-api-access-jxhj2\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436280 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436296 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436316 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436329 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436340 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12aa4f9-2fe2-4bfd-b764-3755131eb10a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.436349 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc612e0c-df99-4189-96eb-ab85d3c20688-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 
crc kubenswrapper[4773]: I0121 15:47:07.456160 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.459243 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.460940 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2daf1511-7c23-47cc-900d-07ed88b573c3" (UID: "2daf1511-7c23-47cc-900d-07ed88b573c3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.538770 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.539323 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.539360 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2daf1511-7c23-47cc-900d-07ed88b573c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.616646 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a44a95_1489_4fc8_8cbc_8f82568dcdfd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eae6f1f_bc67_4acc_836b_68396e478669.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc612e0c_df99_4189_96eb_ab85d3c20688.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12aa4f9_2fe2_4bfd_b764_3755131eb10a.slice/crio-10e77be285db4fee662b084f3144e6e738f12f672d39f74b5a0d231562e848fe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a44a95_1489_4fc8_8cbc_8f82568dcdfd.slice/crio-68fe6b76b8445a66ffdce0e01adc48929d8fa5180ccac9fa976e47ffc76b2336\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc612e0c_df99_4189_96eb_ab85d3c20688.slice/crio-361b0f10201416e8f90a0b5f6ddc971af83b0b4eac26e4f0e3cdcdf505d334b1\": RecentStats: unable to find data in memory cache]" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.716648 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wnbml"] Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.762895 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.763080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ssd99" event={"ID":"2daf1511-7c23-47cc-900d-07ed88b573c3","Type":"ContainerDied","Data":"ea4d74e71ead718925525104cae50f5fe9b2cea5f266e4bf324c9b4e37c39803"} Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.763303 4773 scope.go:117] "RemoveContainer" containerID="4e8e03a30ace383bb1100d158f9fad186856f66547281aa119f4cac36eff50e8" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.772028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-77kxr" event={"ID":"de6a84b1-1846-4dd0-be7f-47a8872227ff","Type":"ContainerStarted","Data":"776ffd65e5f9d25fb0aabd5932a286dc6c16b7cc45cc3b53741f73dc28f94960"} Jan 21 15:47:07 crc kubenswrapper[4773]: W0121 15:47:07.790772 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15b4c36_657d_47af_a103_b268a9727430.slice/crio-b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201 WatchSource:0}: Error finding container b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201: Status 404 returned error can't find the container with id b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201 Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.795888 
4773 scope.go:117] "RemoveContainer" containerID="599e0869d6316a92a42f1b552d07a7c5804b159674b137b042200e74126d9506" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.798660 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.798815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerStarted","Data":"9f60cde77726f6306ecedc80c1577069686296b7d6f37e8456fe72da241e3cd5"} Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.799098 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-l79d2" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.799589 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x5cx" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.801009 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6bhfp" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.808465 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:47:07 crc kubenswrapper[4773]: W0121 15:47:07.816226 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cf09d70_1803_460f_ba8d_0434313796cf.slice/crio-1375bb58a3b7bcd7928d8fab95118d4d463c48ca84d5e0e7cef0633c40320155 WatchSource:0}: Error finding container 1375bb58a3b7bcd7928d8fab95118d4d463c48ca84d5e0e7cef0633c40320155: Status 404 returned error can't find the container with id 1375bb58a3b7bcd7928d8fab95118d4d463c48ca84d5e0e7cef0633c40320155 Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.817799 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-77kxr" podStartSLOduration=2.306255134 podStartE2EDuration="1m5.81778145s" podCreationTimestamp="2026-01-21 15:46:02 +0000 UTC" firstStartedPulling="2026-01-21 15:46:03.653828757 +0000 UTC m=+1328.578318379" lastFinishedPulling="2026-01-21 15:47:07.165355073 +0000 UTC m=+1392.089844695" observedRunningTime="2026-01-21 15:47:07.792261162 +0000 UTC m=+1392.716750804" watchObservedRunningTime="2026-01-21 15:47:07.81778145 +0000 UTC m=+1392.742271072" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.852473 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"] Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.864058 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ssd99"] Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.912874 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.921161 4773 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.974398 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.975881 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="init" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.975908 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="init" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.975929 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eae6f1f-bc67-4acc-836b-68396e478669" containerName="keystone-bootstrap" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.975938 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eae6f1f-bc67-4acc-836b-68396e478669" containerName="keystone-bootstrap" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.975971 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" containerName="neutron-db-sync" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.975977 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" containerName="neutron-db-sync" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.975988 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-httpd" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.975994 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-httpd" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.976005 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" containerName="placement-db-sync" 
Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976011 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" containerName="placement-db-sync" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.976027 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="dnsmasq-dns" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976033 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="dnsmasq-dns" Jan 21 15:47:07 crc kubenswrapper[4773]: E0121 15:47:07.976044 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-log" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976050 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-log" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976263 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eae6f1f-bc67-4acc-836b-68396e478669" containerName="keystone-bootstrap" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976284 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" containerName="neutron-db-sync" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976297 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-httpd" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976311 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" containerName="placement-db-sync" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.976327 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" containerName="glance-log" Jan 21 15:47:07 
crc kubenswrapper[4773]: I0121 15:47:07.976336 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" containerName="dnsmasq-dns" Jan 21 15:47:07 crc kubenswrapper[4773]: I0121 15:47:07.991236 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.005656 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.005863 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.031216 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.065790 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.067580 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.090218 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.130794 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cd9678444-nmk9w"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.157236 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163704 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163572 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tm9bd" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163739 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dghfb\" (UniqueName: \"kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163825 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163915 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163961 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.163984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.164014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.170824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.171085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.184647 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-cd9678444-nmk9w"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.185909 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265762 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265823 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q542x\" (UniqueName: \"kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265943 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dghfb\" (UniqueName: 
\"kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265959 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.265987 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sr5v\" (UniqueName: \"kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266097 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266129 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266148 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266188 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266228 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.266266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.267408 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.269321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.277673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.282485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.284910 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.285011 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e6ddc15da45c425afe1609cfb31bc40a4ae0f7e1b60627fb6c6d646b1880744e/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.285509 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.289953 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.294524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dghfb\" (UniqueName: \"kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.351663 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5cf6c78dd-68gm6"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.353749 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.373418 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.373887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.373993 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.373935 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.374241 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x8xpq" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sr5v\" (UniqueName: \"kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc\") 
pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " 
pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387457 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.387615 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q542x\" (UniqueName: \"kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.388111 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.389086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc 
kubenswrapper[4773]: I0121 15:47:08.389750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.390420 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.390658 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.392555 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cb88dccdd-v7jgc"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.395824 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.404431 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.420980 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.421206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.421262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.421213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sf6hv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.421526 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.423339 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.424397 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cb88dccdd-v7jgc"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.425666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 
15:47:08.426918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q542x\" (UniqueName: \"kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x\") pod \"dnsmasq-dns-55f844cf75-thzmv\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") " pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.428357 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sr5v\" (UniqueName: \"kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.430267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.434652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle\") pod \"neutron-cd9678444-nmk9w\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.441811 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.461649 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cf6c78dd-68gm6"] Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.489846 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-config-data\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.489940 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-credential-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-combined-ca-bundle\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490156 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-config-data\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490174 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-internal-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-combined-ca-bundle\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-scripts\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-public-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490551 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54ed186-3f20-46df-8d62-e6a4daa84fed-logs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-public-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490862 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwjv\" (UniqueName: \"kubernetes.io/projected/d5f9230a-2f00-48b8-bd84-4e080a4b907e-kube-api-access-4zwjv\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-fernet-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.490920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-scripts\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.491038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-internal-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6" Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.491082 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qs9vx\" (UniqueName: \"kubernetes.io/projected/b54ed186-3f20-46df-8d62-e6a4daa84fed-kube-api-access-qs9vx\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.511072 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd9678444-nmk9w"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.592762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-scripts\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.597868 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-public-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54ed186-3f20-46df-8d62-e6a4daa84fed-logs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-public-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwjv\" (UniqueName: \"kubernetes.io/projected/d5f9230a-2f00-48b8-bd84-4e080a4b907e-kube-api-access-4zwjv\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-scripts\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-fernet-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-scripts\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598892 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54ed186-3f20-46df-8d62-e6a4daa84fed-logs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.598909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-internal-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs9vx\" (UniqueName: \"kubernetes.io/projected/b54ed186-3f20-46df-8d62-e6a4daa84fed-kube-api-access-qs9vx\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-config-data\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599485 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-credential-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599713 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-combined-ca-bundle\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-config-data\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.599929 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-internal-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.600089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-combined-ca-bundle\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.614633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-fernet-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.621743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " pod="openstack/glance-default-external-api-0"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.624119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs9vx\" (UniqueName: \"kubernetes.io/projected/b54ed186-3f20-46df-8d62-e6a4daa84fed-kube-api-access-qs9vx\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.626546 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-internal-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.627150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-config-data\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.628665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-combined-ca-bundle\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.628767 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-internal-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.629145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-combined-ca-bundle\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.629553 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54ed186-3f20-46df-8d62-e6a4daa84fed-public-tls-certs\") pod \"placement-5cf6c78dd-68gm6\" (UID: \"b54ed186-3f20-46df-8d62-e6a4daa84fed\") " pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.629886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwjv\" (UniqueName: \"kubernetes.io/projected/d5f9230a-2f00-48b8-bd84-4e080a4b907e-kube-api-access-4zwjv\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.630297 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-public-tls-certs\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.633478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-config-data\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.633726 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.634036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-credential-keys\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.660044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5f9230a-2f00-48b8-bd84-4e080a4b907e-scripts\") pod \"keystone-6cb88dccdd-v7jgc\" (UID: \"d5f9230a-2f00-48b8-bd84-4e080a4b907e\") " pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.788586 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.789405 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.849973 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerStarted","Data":"1375bb58a3b7bcd7928d8fab95118d4d463c48ca84d5e0e7cef0633c40320155"}
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.883214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wnbml" event={"ID":"e15b4c36-657d-47af-a103-b268a9727430","Type":"ContainerStarted","Data":"3f780e2e7efd070f6ca156eac46e0794a60b1878fd3c276a4f4c925e35aaa3f3"}
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.883275 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wnbml" event={"ID":"e15b4c36-657d-47af-a103-b268a9727430","Type":"ContainerStarted","Data":"b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201"}
Jan 21 15:47:08 crc kubenswrapper[4773]: I0121 15:47:08.995831 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-wnbml" podStartSLOduration=12.99580382 podStartE2EDuration="12.99580382s" podCreationTimestamp="2026-01-21 15:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:08.915896591 +0000 UTC m=+1393.840386213" watchObservedRunningTime="2026-01-21 15:47:08.99580382 +0000 UTC m=+1393.920293442"
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.441969 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2daf1511-7c23-47cc-900d-07ed88b573c3" path="/var/lib/kubelet/pods/2daf1511-7c23-47cc-900d-07ed88b573c3/volumes"
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.444153 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc612e0c-df99-4189-96eb-ab85d3c20688" path="/var/lib/kubelet/pods/fc612e0c-df99-4189-96eb-ab85d3c20688/volumes"
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.447839 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"]
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.822544 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 21 15:47:09 crc kubenswrapper[4773]: W0121 15:47:09.855518 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb19b844_b560_4be2_8709_f78158c0eb36.slice/crio-2d2882707873a5f0b38a8f1aa367811fb5d7d432295dde9ace28b3976e867b5f WatchSource:0}: Error finding container 2d2882707873a5f0b38a8f1aa367811fb5d7d432295dde9ace28b3976e867b5f: Status 404 returned error can't find the container with id 2d2882707873a5f0b38a8f1aa367811fb5d7d432295dde9ace28b3976e867b5f
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.923256 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerStarted","Data":"2d2882707873a5f0b38a8f1aa367811fb5d7d432295dde9ace28b3976e867b5f"}
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.927246 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" event={"ID":"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5","Type":"ContainerStarted","Data":"b3029c0d12356ea5d4c378d6b9ef93053a1da67a6b339e167927d5e1455ca7ca"}
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.943293 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd9678444-nmk9w"]
Jan 21 15:47:09 crc kubenswrapper[4773]: I0121 15:47:09.958233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znjg2" event={"ID":"5c0beb28-481c-4507-94c2-d644e4faf5ab","Type":"ContainerStarted","Data":"4db387886c94d9295b7d0446bd4a1606255259c595aabc38c5d6e12d844dc3b8"}
Jan 21 15:47:10 crc kubenswrapper[4773]: I0121 15:47:10.016817 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerStarted","Data":"c7d16007bee633cfb0a063e976216e8c7a67a364a72cd792889055d53bed6133"}
Jan 21 15:47:10 crc kubenswrapper[4773]: I0121 15:47:10.034478 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-znjg2" podStartSLOduration=4.949653819 podStartE2EDuration="1m8.034457713s" podCreationTimestamp="2026-01-21 15:46:02 +0000 UTC" firstStartedPulling="2026-01-21 15:46:04.10735562 +0000 UTC m=+1329.031845242" lastFinishedPulling="2026-01-21 15:47:07.192159514 +0000 UTC m=+1392.116649136" observedRunningTime="2026-01-21 15:47:10.000099989 +0000 UTC m=+1394.924589621" watchObservedRunningTime="2026-01-21 15:47:10.034457713 +0000 UTC m=+1394.958947335"
Jan 21 15:47:10 crc kubenswrapper[4773]: I0121 15:47:10.063651 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cf6c78dd-68gm6"]
Jan 21 15:47:10 crc kubenswrapper[4773]: I0121 15:47:10.085264 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cb88dccdd-v7jgc"]
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.049059 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cb88dccdd-v7jgc" event={"ID":"d5f9230a-2f00-48b8-bd84-4e080a4b907e","Type":"ContainerStarted","Data":"8c6445709c9c45186b0b9c91a155b5bbecec878ab3d990caceca88a1b2856c3f"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.049922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cb88dccdd-v7jgc" event={"ID":"d5f9230a-2f00-48b8-bd84-4e080a4b907e","Type":"ContainerStarted","Data":"49b213c8a6e66c7cb88ca8a2a0f852eebc37e582be4a13649db47de48b7018b1"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.052930 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.060883 4773 generic.go:334] "Generic (PLEG): container finished" podID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" containerID="c67b185dffa20fe1a4a76944ac60a9359ff42fdac69e7b491c802b48b2b8a22d" exitCode=0
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.060992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2bxp" event={"ID":"3cd3f2b8-c365-4845-8508-2403d3b1f03e","Type":"ContainerDied","Data":"c67b185dffa20fe1a4a76944ac60a9359ff42fdac69e7b491c802b48b2b8a22d"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.067880 4773 generic.go:334] "Generic (PLEG): container finished" podID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerID="d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61" exitCode=0
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.067974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" event={"ID":"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5","Type":"ContainerDied","Data":"d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.089165 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6cb88dccdd-v7jgc" podStartSLOduration=3.089139227 podStartE2EDuration="3.089139227s" podCreationTimestamp="2026-01-21 15:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:11.083233758 +0000 UTC m=+1396.007723380" watchObservedRunningTime="2026-01-21 15:47:11.089139227 +0000 UTC m=+1396.013628849"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.112144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerStarted","Data":"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.112193 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerStarted","Data":"196e08daaf69876c023c03ec9602563fa9350b526f68381f33630239f39eb746"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.113017 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cd9678444-nmk9w"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.115021 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf6c78dd-68gm6" event={"ID":"b54ed186-3f20-46df-8d62-e6a4daa84fed","Type":"ContainerStarted","Data":"649357b992ff895618fa0bf592e17cca4524652c87c79766e42e1c6f82392a06"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.131485 4773 generic.go:334] "Generic (PLEG): container finished" podID="e15b4c36-657d-47af-a103-b268a9727430" containerID="3f780e2e7efd070f6ca156eac46e0794a60b1878fd3c276a4f4c925e35aaa3f3" exitCode=0
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.131536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wnbml" event={"ID":"e15b4c36-657d-47af-a103-b268a9727430","Type":"ContainerDied","Data":"3f780e2e7efd070f6ca156eac46e0794a60b1878fd3c276a4f4c925e35aaa3f3"}
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.211855 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cd9678444-nmk9w" podStartSLOduration=3.211820556 podStartE2EDuration="3.211820556s" podCreationTimestamp="2026-01-21 15:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:11.183201326 +0000 UTC m=+1396.107690948" watchObservedRunningTime="2026-01-21 15:47:11.211820556 +0000 UTC m=+1396.136310188"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.838934 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-746866d6b5-jbp68"]
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.843310 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.848193 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.876422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.889283 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-746866d6b5-jbp68"]
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-ovndb-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936426 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-public-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7b55\" (UniqueName: \"kubernetes.io/projected/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-kube-api-access-z7b55\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-combined-ca-bundle\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-httpd-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:11 crc kubenswrapper[4773]: I0121 15:47:11.936582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-internal-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.039601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-public-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.040145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7b55\" (UniqueName: \"kubernetes.io/projected/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-kube-api-access-z7b55\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.040374 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.040507 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-combined-ca-bundle\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.040670 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-httpd-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.040861 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-internal-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.041398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-ovndb-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.046642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-httpd-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.048077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-combined-ca-bundle\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.051439 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-ovndb-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.053371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-public-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.053540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-config\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.056246 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-internal-tls-certs\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.078157 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7b55\" (UniqueName: \"kubernetes.io/projected/711d6a0d-d24f-4d48-b73b-3b5418fe12bf-kube-api-access-z7b55\") pod \"neutron-746866d6b5-jbp68\" (UID: \"711d6a0d-d24f-4d48-b73b-3b5418fe12bf\") " pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.169844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" event={"ID":"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5","Type":"ContainerStarted","Data":"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.172127 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-thzmv"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.183578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerStarted","Data":"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.193080 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" podStartSLOduration=5.193064206 podStartE2EDuration="5.193064206s" podCreationTimestamp="2026-01-21 15:47:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:12.192203072 +0000 UTC m=+1397.116692704" watchObservedRunningTime="2026-01-21 15:47:12.193064206 +0000 UTC m=+1397.117553828"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.208528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf6c78dd-68gm6" event={"ID":"b54ed186-3f20-46df-8d62-e6a4daa84fed","Type":"ContainerStarted","Data":"98e588f79786cd59d621b3edc0bb9e207372371eddbb0ef7f02badc8f65d2a4d"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.208575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf6c78dd-68gm6" event={"ID":"b54ed186-3f20-46df-8d62-e6a4daa84fed","Type":"ContainerStarted","Data":"c4dcb481f121f6b4388886895c545195da7a33ea2ef458d28f75d122424876a8"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.208901 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.209054 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.224894 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.244090 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5cf6c78dd-68gm6" podStartSLOduration=4.244066077 podStartE2EDuration="4.244066077s" podCreationTimestamp="2026-01-21 15:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:12.240083729 +0000 UTC m=+1397.164573351" watchObservedRunningTime="2026-01-21 15:47:12.244066077 +0000 UTC m=+1397.168555709"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.265586 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerStarted","Data":"58e704fd2700629523c14d0263c082cfa2eeefdb8d62ab5033503ff2a081525f"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.311799 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.311783877 podStartE2EDuration="14.311783877s" podCreationTimestamp="2026-01-21 15:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:12.292853139 +0000 UTC m=+1397.217342761" watchObservedRunningTime="2026-01-21 15:47:12.311783877 +0000 UTC m=+1397.236273499"
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.325923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerStarted","Data":"b53978a831c8cdfbd46b0bc68784acf220e232095b7967ac9b5dc22720dc77c5"}
Jan 21 15:47:12 crc kubenswrapper[4773]: I0121 15:47:12.999115 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2bxp"
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.074925 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8bhs\" (UniqueName: \"kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs\") pod \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") "
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.075085 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data\") pod \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") "
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.075511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle\") pod \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\" (UID: \"3cd3f2b8-c365-4845-8508-2403d3b1f03e\") "
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.082233 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3cd3f2b8-c365-4845-8508-2403d3b1f03e" (UID: "3cd3f2b8-c365-4845-8508-2403d3b1f03e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.083970 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs" (OuterVolumeSpecName: "kube-api-access-z8bhs") pod "3cd3f2b8-c365-4845-8508-2403d3b1f03e" (UID: "3cd3f2b8-c365-4845-8508-2403d3b1f03e").
InnerVolumeSpecName "kube-api-access-z8bhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.148912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cd3f2b8-c365-4845-8508-2403d3b1f03e" (UID: "3cd3f2b8-c365-4845-8508-2403d3b1f03e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.178077 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.178361 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8bhs\" (UniqueName: \"kubernetes.io/projected/3cd3f2b8-c365-4845-8508-2403d3b1f03e-kube-api-access-z8bhs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.178372 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cd3f2b8-c365-4845-8508-2403d3b1f03e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.179049 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wnbml" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.279659 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts\") pod \"e15b4c36-657d-47af-a103-b268a9727430\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.279884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwr6z\" (UniqueName: \"kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z\") pod \"e15b4c36-657d-47af-a103-b268a9727430\" (UID: \"e15b4c36-657d-47af-a103-b268a9727430\") " Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.280253 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e15b4c36-657d-47af-a103-b268a9727430" (UID: "e15b4c36-657d-47af-a103-b268a9727430"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.280537 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15b4c36-657d-47af-a103-b268a9727430-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.285574 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z" (OuterVolumeSpecName: "kube-api-access-bwr6z") pod "e15b4c36-657d-47af-a103-b268a9727430" (UID: "e15b4c36-657d-47af-a103-b268a9727430"). InnerVolumeSpecName "kube-api-access-bwr6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.307889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-746866d6b5-jbp68"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.356189 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wnbml" event={"ID":"e15b4c36-657d-47af-a103-b268a9727430","Type":"ContainerDied","Data":"b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201"} Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.356229 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73cac7215e18c754425c61959ced5ea9f600c4da0a2c7b5cbb9989f01bc4201" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.356299 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wnbml" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.362960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2bxp" event={"ID":"3cd3f2b8-c365-4845-8508-2403d3b1f03e","Type":"ContainerDied","Data":"8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471"} Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.363015 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8196ff7f96be10da8345381f2dbc075b8a9493a30a2ae8ddec8c79a7c5d62471" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.363235 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n2bxp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.375053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746866d6b5-jbp68" event={"ID":"711d6a0d-d24f-4d48-b73b-3b5418fe12bf","Type":"ContainerStarted","Data":"2d48edbe5a29566808076a5055efd48a1521e34b5791641cf2bc7e0257a89732"} Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.382623 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwr6z\" (UniqueName: \"kubernetes.io/projected/e15b4c36-657d-47af-a103-b268a9727430-kube-api-access-bwr6z\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.447458 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-794b46c66c-dj5lp"] Jan 21 15:47:13 crc kubenswrapper[4773]: E0121 15:47:13.447846 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" containerName="barbican-db-sync" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.447864 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" containerName="barbican-db-sync" Jan 21 15:47:13 crc kubenswrapper[4773]: E0121 15:47:13.447898 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b4c36-657d-47af-a103-b268a9727430" containerName="mariadb-account-create-update" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.447906 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b4c36-657d-47af-a103-b268a9727430" containerName="mariadb-account-create-update" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.448106 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b4c36-657d-47af-a103-b268a9727430" containerName="mariadb-account-create-update" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.448124 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" containerName="barbican-db-sync" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.471442 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.475653 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.475769 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rwbpc" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.476481 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.476879 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-794b46c66c-dj5lp"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.529654 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d45c98f8b-b4vzj"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.538914 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.544539 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.552651 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d45c98f8b-b4vzj"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.573764 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-combined-ca-bundle\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d5b884-1964-4585-a6b4-bd7813ee52c8-logs\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709199 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data-custom\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709346 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5h9\" (UniqueName: \"kubernetes.io/projected/e7d5b884-1964-4585-a6b4-bd7813ee52c8-kube-api-access-wr5h9\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709750 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data-custom\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-logs\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.709856 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhcmx\" (UniqueName: \"kubernetes.io/projected/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-kube-api-access-hhcmx\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " 
pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.710043 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.710134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.714578 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.716476 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.818856 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-logs\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819231 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhcmx\" (UniqueName: \"kubernetes.io/projected/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-kube-api-access-hhcmx\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819365 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-logs\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" 
(UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-combined-ca-bundle\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d5b884-1964-4585-a6b4-bd7813ee52c8-logs\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data-custom\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5h9\" (UniqueName: \"kubernetes.io/projected/e7d5b884-1964-4585-a6b4-bd7813ee52c8-kube-api-access-wr5h9\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data\") pod 
\"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.819757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data-custom\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.820276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d5b884-1964-4585-a6b4-bd7813ee52c8-logs\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.822810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.838834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.847915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data-custom\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.857518 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data-custom\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.862177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-combined-ca-bundle\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.878828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-config-data\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.905534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhcmx\" (UniqueName: \"kubernetes.io/projected/36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d-kube-api-access-hhcmx\") pod \"barbican-keystone-listener-5d45c98f8b-b4vzj\" (UID: \"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d\") " pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.925321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5h9\" (UniqueName: \"kubernetes.io/projected/e7d5b884-1964-4585-a6b4-bd7813ee52c8-kube-api-access-wr5h9\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 
15:47:13.927040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d5b884-1964-4585-a6b4-bd7813ee52c8-config-data\") pod \"barbican-worker-794b46c66c-dj5lp\" (UID: \"e7d5b884-1964-4585-a6b4-bd7813ee52c8\") " pod="openstack/barbican-worker-794b46c66c-dj5lp" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.947604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.947674 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.947731 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm6gd\" (UniqueName: \"kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.948053 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: 
I0121 15:47:13.948120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.948155 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.967755 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"] Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.969820 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.985228 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Jan 21 15:47:13 crc kubenswrapper[4773]: I0121 15:47:13.995448 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"]
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.049822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.049901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.049931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm6gd\" (UniqueName: \"kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.050076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.050148 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.050179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.050854 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.051490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.056421 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.057370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.061394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.107800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm6gd\" (UniqueName: \"kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd\") pod \"dnsmasq-dns-85ff748b95-hmr6x\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.131378 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-794b46c66c-dj5lp"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.152808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.152901 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.152950 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckr9b\" (UniqueName: \"kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.153119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.153643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.194174 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.256204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.256568 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.256678 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.258589 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.258629 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr9b\" (UniqueName: \"kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.260188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.268633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.270909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.282753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.286207 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr9b\" (UniqueName: \"kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b\") pod \"barbican-api-668bc45dd4-pd2gr\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.326051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-668bc45dd4-pd2gr"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.360548 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x"
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.433274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerStarted","Data":"cdc153495a400da1fe5074cd3f66136a2bf1ea23e35cfac649ae0bd5759b7bec"}
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.453900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746866d6b5-jbp68" event={"ID":"711d6a0d-d24f-4d48-b73b-3b5418fe12bf","Type":"ContainerStarted","Data":"62b357bb051997c55c6c42587cd38774747c61dad3b63c684e9b14a07a5fa445"}
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.454098 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="dnsmasq-dns" containerID="cri-o://3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a" gracePeriod=10
Jan 21 15:47:14 crc kubenswrapper[4773]: I0121 15:47:14.487155 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.487132609 podStartE2EDuration="7.487132609s" podCreationTimestamp="2026-01-21 15:47:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:14.467534572 +0000 UTC m=+1399.392024194" watchObservedRunningTime="2026-01-21 15:47:14.487132609 +0000 UTC m=+1399.411622241"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.149512 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-794b46c66c-dj5lp"]
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.190764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d45c98f8b-b4vzj"]
Jan 21 15:47:15 crc kubenswrapper[4773]: W0121 15:47:15.198353 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d5f91b_6476_4d00_a9ed_fe1d2b4fe36d.slice/crio-e1ecdaa40a5776e9883a203e00ef5cf0ab415ba53e5d00e86831a863bf4198db WatchSource:0}: Error finding container e1ecdaa40a5776e9883a203e00ef5cf0ab415ba53e5d00e86831a863bf4198db: Status 404 returned error can't find the container with id e1ecdaa40a5776e9883a203e00ef5cf0ab415ba53e5d00e86831a863bf4198db
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.291025 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-thzmv"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384433 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384503 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384660 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q542x\" (UniqueName: \"kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384674 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.384719 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb\") pod \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\" (UID: \"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5\") "
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.400193 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x" (OuterVolumeSpecName: "kube-api-access-q542x") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "kube-api-access-q542x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.456447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.473865 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.479626 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.487049 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.487083 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.487096 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q542x\" (UniqueName: \"kubernetes.io/projected/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-kube-api-access-q542x\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.487108 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.489196 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.498135 4773 generic.go:334] "Generic (PLEG): container finished" podID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerID="3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a" exitCode=0
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.498261 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-thzmv"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.519783 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-746866d6b5-jbp68" podStartSLOduration=4.51976584 podStartE2EDuration="4.51976584s" podCreationTimestamp="2026-01-21 15:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:15.511188758 +0000 UTC m=+1400.435678390" watchObservedRunningTime="2026-01-21 15:47:15.51976584 +0000 UTC m=+1400.444255472"
Jan 21 15:47:15 crc kubenswrapper[4773]: W0121 15:47:15.527301 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30c0bedd_4d99_4e6e_9276_7dcacb65b18f.slice/crio-e13007318f90a0c91eed3026a5b6da32c72b83c06904d6573bbec7943124769e WatchSource:0}: Error finding container e13007318f90a0c91eed3026a5b6da32c72b83c06904d6573bbec7943124769e: Status 404 returned error can't find the container with id e13007318f90a0c91eed3026a5b6da32c72b83c06904d6573bbec7943124769e
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-746866d6b5-jbp68"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543078 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746866d6b5-jbp68" event={"ID":"711d6a0d-d24f-4d48-b73b-3b5418fe12bf","Type":"ContainerStarted","Data":"9ce068cf0ce7b372a6793ecc9990c68730b02fac7975098c2b44f5270138f66a"}
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" event={"ID":"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5","Type":"ContainerDied","Data":"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"}
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543122 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-thzmv" event={"ID":"d7e8e044-111a-40a3-9c3a-c5fe946b9bb5","Type":"ContainerDied","Data":"b3029c0d12356ea5d4c378d6b9ef93053a1da67a6b339e167927d5e1455ca7ca"}
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" event={"ID":"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d","Type":"ContainerStarted","Data":"e1ecdaa40a5776e9883a203e00ef5cf0ab415ba53e5d00e86831a863bf4198db"}
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-794b46c66c-dj5lp" event={"ID":"e7d5b884-1964-4585-a6b4-bd7813ee52c8","Type":"ContainerStarted","Data":"4f2ba577cf5ccc24c5cc758db839f3775c734fd8e8b2c83b1154729428512987"}
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543174 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"]
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.543199 4773 scope.go:117] "RemoveContainer" containerID="3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.549215 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config" (OuterVolumeSpecName: "config") pod "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" (UID: "d7e8e044-111a-40a3-9c3a-c5fe946b9bb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.580038 4773 scope.go:117] "RemoveContainer" containerID="d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.588765 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.588803 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.612018 4773 scope.go:117] "RemoveContainer" containerID="3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"
Jan 21 15:47:15 crc kubenswrapper[4773]: E0121 15:47:15.612436 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a\": container with ID starting with 3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a not found: ID does not exist" containerID="3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.612467 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a"} err="failed to get container status \"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a\": rpc error: code = NotFound desc = could not find container \"3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a\": container with ID starting with 3a9a615a16eb22b46e41b0b5b64a1319c99a9690474e45b9323b8a22ca535b2a not found: ID does not exist"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.612487 4773 scope.go:117] "RemoveContainer" containerID="d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61"
Jan 21 15:47:15 crc kubenswrapper[4773]: E0121 15:47:15.612724 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61\": container with ID starting with d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61 not found: ID does not exist" containerID="d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.612748 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61"} err="failed to get container status \"d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61\": rpc error: code = NotFound desc = could not find container \"d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61\": container with ID starting with d5c42b4f53fe8ce8ada82d79f4bf18fbf9e16ed37442f81ce55014421766bf61 not found: ID does not exist"
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.771943 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"]
Jan 21 15:47:15 crc kubenswrapper[4773]: W0121 15:47:15.788083 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2696559_c843_4ec6_a347_f91ae2c790d3.slice/crio-b54f93c4f02dd14dcb6fbe7a97f4f8c0c08086996fd70bd207aaff2f7017af17 WatchSource:0}: Error finding container b54f93c4f02dd14dcb6fbe7a97f4f8c0c08086996fd70bd207aaff2f7017af17: Status 404 returned error can't find the container with id b54f93c4f02dd14dcb6fbe7a97f4f8c0c08086996fd70bd207aaff2f7017af17
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.973614 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"]
Jan 21 15:47:15 crc kubenswrapper[4773]: I0121 15:47:15.988677 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-thzmv"]
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.529485 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerID="dd7f972c369b04e57e6c5940c12fadfc33324a7028f191963036eb5e831ceecb" exitCode=0
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.529660 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" event={"ID":"b2696559-c843-4ec6-a347-f91ae2c790d3","Type":"ContainerDied","Data":"dd7f972c369b04e57e6c5940c12fadfc33324a7028f191963036eb5e831ceecb"}
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.529912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" event={"ID":"b2696559-c843-4ec6-a347-f91ae2c790d3","Type":"ContainerStarted","Data":"b54f93c4f02dd14dcb6fbe7a97f4f8c0c08086996fd70bd207aaff2f7017af17"}
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.532200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerStarted","Data":"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac"}
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.532259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerStarted","Data":"e13007318f90a0c91eed3026a5b6da32c72b83c06904d6573bbec7943124769e"}
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.904047 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cf4556c4-hwkr9"]
Jan 21 15:47:16 crc kubenswrapper[4773]: E0121 15:47:16.904452 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="dnsmasq-dns"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.904470 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="dnsmasq-dns"
Jan 21 15:47:16 crc kubenswrapper[4773]: E0121 15:47:16.904480 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="init"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.904486 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="init"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.904686 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" containerName="dnsmasq-dns"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.905767 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.908351 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.912583 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Jan 21 15:47:16 crc kubenswrapper[4773]: I0121 15:47:16.938780 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cf4556c4-hwkr9"]
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-combined-ca-bundle\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smsq4\" (UniqueName: \"kubernetes.io/projected/8e7d6f73-a63d-40a4-acda-12edb288ec53-kube-api-access-smsq4\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097778 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-public-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data-custom\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097829 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e7d6f73-a63d-40a4-acda-12edb288ec53-logs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097895 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.097913 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-internal-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-combined-ca-bundle\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199599 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smsq4\" (UniqueName: \"kubernetes.io/projected/8e7d6f73-a63d-40a4-acda-12edb288ec53-kube-api-access-smsq4\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199679 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-public-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199745 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data-custom\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199776 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e7d6f73-a63d-40a4-acda-12edb288ec53-logs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199874 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.199906 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-internal-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.201357 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e7d6f73-a63d-40a4-acda-12edb288ec53-logs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.210215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data-custom\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.211214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-combined-ca-bundle\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.212659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-config-data\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.213362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-public-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.215597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e7d6f73-a63d-40a4-acda-12edb288ec53-internal-tls-certs\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.227034 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smsq4\" (UniqueName: \"kubernetes.io/projected/8e7d6f73-a63d-40a4-acda-12edb288ec53-kube-api-access-smsq4\") pod \"barbican-api-5cf4556c4-hwkr9\" (UID: \"8e7d6f73-a63d-40a4-acda-12edb288ec53\") " pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.407767 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8e044-111a-40a3-9c3a-c5fe946b9bb5" path="/var/lib/kubelet/pods/d7e8e044-111a-40a3-9c3a-c5fe946b9bb5/volumes"
Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.527350 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cf4556c4-hwkr9" Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.550261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" event={"ID":"b2696559-c843-4ec6-a347-f91ae2c790d3","Type":"ContainerStarted","Data":"54db186a5a591064e5aeaaa5b58b527dc40d68aba19e9bc384d839b68250d7fd"} Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.550476 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.574973 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerStarted","Data":"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8"} Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.576480 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.576526 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.582463 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" podStartSLOduration=4.582437261 podStartE2EDuration="4.582437261s" podCreationTimestamp="2026-01-21 15:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:17.572065852 +0000 UTC m=+1402.496555474" watchObservedRunningTime="2026-01-21 15:47:17.582437261 +0000 UTC m=+1402.506926883" Jan 21 15:47:17 crc kubenswrapper[4773]: I0121 15:47:17.610991 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-668bc45dd4-pd2gr" 
podStartSLOduration=4.610963988 podStartE2EDuration="4.610963988s" podCreationTimestamp="2026-01-21 15:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:17.597717802 +0000 UTC m=+1402.522207454" watchObservedRunningTime="2026-01-21 15:47:17.610963988 +0000 UTC m=+1402.535453610" Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.042683 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cf4556c4-hwkr9"] Jan 21 15:47:18 crc kubenswrapper[4773]: W0121 15:47:18.045856 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e7d6f73_a63d_40a4_acda_12edb288ec53.slice/crio-72f0829f272001f473b292907c301c8f87febf40b8f7d8a5a81c33f434282874 WatchSource:0}: Error finding container 72f0829f272001f473b292907c301c8f87febf40b8f7d8a5a81c33f434282874: Status 404 returned error can't find the container with id 72f0829f272001f473b292907c301c8f87febf40b8f7d8a5a81c33f434282874 Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.058391 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wnbml"] Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.072054 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wnbml"] Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.592591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cf4556c4-hwkr9" event={"ID":"8e7d6f73-a63d-40a4-acda-12edb288ec53","Type":"ContainerStarted","Data":"e9122529451bd0e75703e9c921b0f4a4f55fad8ae2a7a04793749235d7c33c01"} Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.592655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cf4556c4-hwkr9" 
event={"ID":"8e7d6f73-a63d-40a4-acda-12edb288ec53","Type":"ContainerStarted","Data":"72f0829f272001f473b292907c301c8f87febf40b8f7d8a5a81c33f434282874"} Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.634529 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.634592 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.709358 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 15:47:18 crc kubenswrapper[4773]: I0121 15:47:18.714520 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.324768 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.324813 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.359703 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.374398 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.401770 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15b4c36-657d-47af-a103-b268a9727430" path="/var/lib/kubelet/pods/e15b4c36-657d-47af-a103-b268a9727430/volumes" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.601719 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.601765 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.601777 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:19 crc kubenswrapper[4773]: I0121 15:47:19.601786 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 15:47:21 crc kubenswrapper[4773]: I0121 15:47:21.620177 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:47:21 crc kubenswrapper[4773]: I0121 15:47:21.620842 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.074676 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pdgr9"] Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.076293 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.078317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.091587 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pdgr9"] Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.139888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blb65\" (UniqueName: \"kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65\") pod \"root-account-create-update-pdgr9\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.140230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts\") pod \"root-account-create-update-pdgr9\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.241738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blb65\" (UniqueName: \"kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65\") pod \"root-account-create-update-pdgr9\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.241812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts\") pod \"root-account-create-update-pdgr9\" (UID: 
\"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.242606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts\") pod \"root-account-create-update-pdgr9\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.267359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blb65\" (UniqueName: \"kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65\") pod \"root-account-create-update-pdgr9\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") " pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.404425 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pdgr9" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.642657 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.642769 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.650679 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.656717 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.656826 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:47:23 crc kubenswrapper[4773]: I0121 15:47:23.669349 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 15:47:24 crc kubenswrapper[4773]: I0121 15:47:24.362881 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:47:24 crc kubenswrapper[4773]: I0121 15:47:24.457493 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"] Jan 21 15:47:24 crc kubenswrapper[4773]: I0121 15:47:24.457745 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="dnsmasq-dns" containerID="cri-o://8d344507933e713208a41949c0305bb539dcf31453ebaf04d167b14b07955173" gracePeriod=10 Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.205913 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.206260 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.206311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.207098 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.207159 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3" gracePeriod=600 Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.642075 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 
15:47:25.699918 4773 generic.go:334] "Generic (PLEG): container finished" podID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerID="8d344507933e713208a41949c0305bb539dcf31453ebaf04d167b14b07955173" exitCode=0 Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.700017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" event={"ID":"819f84c9-b51d-4f3c-81c2-c39af2110563","Type":"ContainerDied","Data":"8d344507933e713208a41949c0305bb539dcf31453ebaf04d167b14b07955173"} Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.717328 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3" exitCode=0 Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.717375 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3"} Jan 21 15:47:25 crc kubenswrapper[4773]: I0121 15:47:25.717407 4773 scope.go:117] "RemoveContainer" containerID="d0aa21e0cf3e6fccf1e5cd944cd86f7a3dd434dbe323f714f139c45999c5ca44" Jan 21 15:47:26 crc kubenswrapper[4773]: I0121 15:47:26.428547 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:26 crc kubenswrapper[4773]: I0121 15:47:26.498384 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:28 crc kubenswrapper[4773]: E0121 15:47:28.022594 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Jan 21 15:47:28 crc kubenswrapper[4773]: E0121 15:47:28.023089 
4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lfxsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(51eaf63a-c7b4-47eb-8357-e746bd703b64): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 15:47:28 crc kubenswrapper[4773]: E0121 15:47:28.024515 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.579353 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661624 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661641 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661791 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lblt\" (UniqueName: \"kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.661863 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0\") pod \"819f84c9-b51d-4f3c-81c2-c39af2110563\" (UID: \"819f84c9-b51d-4f3c-81c2-c39af2110563\") " Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.677595 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt" (OuterVolumeSpecName: "kube-api-access-4lblt") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "kube-api-access-4lblt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.752977 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pdgr9"] Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.758645 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.758669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rw9jg" event={"ID":"819f84c9-b51d-4f3c-81c2-c39af2110563","Type":"ContainerDied","Data":"e44941720147ed4b6ea07e2b4646fb1874f44b15c568973f6eabaac68397c921"} Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.758768 4773 scope.go:117] "RemoveContainer" containerID="8d344507933e713208a41949c0305bb539dcf31453ebaf04d167b14b07955173" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.774646 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-central-agent" containerID="cri-o://ae21538c3d66deff4e80c8fcb1b7b626228d77fa95d07860087df285515858ce" gracePeriod=30 Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.774867 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="sg-core" containerID="cri-o://9f60cde77726f6306ecedc80c1577069686296b7d6f37e8456fe72da241e3cd5" gracePeriod=30 Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.774953 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-notification-agent" containerID="cri-o://1981941504f849effb65da2d5b25a3c47a6a97bbb7d5abeba9d270e23e5816ab" gracePeriod=30 Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.788284 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lblt\" (UniqueName: \"kubernetes.io/projected/819f84c9-b51d-4f3c-81c2-c39af2110563-kube-api-access-4lblt\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.896386 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.902417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.907393 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.910166 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.922804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config" (OuterVolumeSpecName: "config") pod "819f84c9-b51d-4f3c-81c2-c39af2110563" (UID: "819f84c9-b51d-4f3c-81c2-c39af2110563"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.963645 4773 scope.go:117] "RemoveContainer" containerID="8a441b89268ead55e85d6f0400d6038b9a06a10e3aaa5320ff5beb438f5ae18f"
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.995989 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.996196 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.996304 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.996401 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:28 crc kubenswrapper[4773]: I0121 15:47:28.996500 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/819f84c9-b51d-4f3c-81c2-c39af2110563-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.094590 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"]
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.106062 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rw9jg"]
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.405572 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" path="/var/lib/kubelet/pods/819f84c9-b51d-4f3c-81c2-c39af2110563/volumes"
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.823386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pdgr9" event={"ID":"cd62e746-7c8e-4a74-a37e-3daa482a53ba","Type":"ContainerStarted","Data":"06a517f5eb8b0202913474784c903634052c39116f05e4f7a356add716a82ac0"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.823689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pdgr9" event={"ID":"cd62e746-7c8e-4a74-a37e-3daa482a53ba","Type":"ContainerStarted","Data":"78966466cade9e2cb07c4f71e31bc5319cb30e02eb81a2172f99eee210314960"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.854605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" event={"ID":"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d","Type":"ContainerStarted","Data":"cf3034b87e6aec3769a461bb61d5366a8106db79910afdf4a44958730dbe8137"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.870206 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-pdgr9" podStartSLOduration=6.870179589 podStartE2EDuration="6.870179589s" podCreationTimestamp="2026-01-21 15:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:29.859708357 +0000 UTC m=+1414.784197979" watchObservedRunningTime="2026-01-21 15:47:29.870179589 +0000 UTC m=+1414.794669211"
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.882290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.891515 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" podStartSLOduration=4.088466687 podStartE2EDuration="16.891492632s" podCreationTimestamp="2026-01-21 15:47:13 +0000 UTC" firstStartedPulling="2026-01-21 15:47:15.202766484 +0000 UTC m=+1400.127256106" lastFinishedPulling="2026-01-21 15:47:28.005792429 +0000 UTC m=+1412.930282051" observedRunningTime="2026-01-21 15:47:29.887125244 +0000 UTC m=+1414.811614866" watchObservedRunningTime="2026-01-21 15:47:29.891492632 +0000 UTC m=+1414.815982254"
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.900367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cf4556c4-hwkr9" event={"ID":"8e7d6f73-a63d-40a4-acda-12edb288ec53","Type":"ContainerStarted","Data":"84978560b6a9122e095d7db974d6ff24b4fb44a0c6f069becfa8df3e621fa7d0"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.901805 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.901860 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.922657 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-794b46c66c-dj5lp" event={"ID":"e7d5b884-1964-4585-a6b4-bd7813ee52c8","Type":"ContainerStarted","Data":"0041c0604d276efd04609f56705bd028bae89fb33797833676f494b4ea9c9740"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950004 4773 generic.go:334] "Generic (PLEG): container finished" podID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerID="9f60cde77726f6306ecedc80c1577069686296b7d6f37e8456fe72da241e3cd5" exitCode=2
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950043 4773 generic.go:334] "Generic (PLEG): container finished" podID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerID="1981941504f849effb65da2d5b25a3c47a6a97bbb7d5abeba9d270e23e5816ab" exitCode=0
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950055 4773 generic.go:334] "Generic (PLEG): container finished" podID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerID="ae21538c3d66deff4e80c8fcb1b7b626228d77fa95d07860087df285515858ce" exitCode=0
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950087 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerDied","Data":"9f60cde77726f6306ecedc80c1577069686296b7d6f37e8456fe72da241e3cd5"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950118 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerDied","Data":"1981941504f849effb65da2d5b25a3c47a6a97bbb7d5abeba9d270e23e5816ab"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.950134 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerDied","Data":"ae21538c3d66deff4e80c8fcb1b7b626228d77fa95d07860087df285515858ce"}
Jan 21 15:47:29 crc kubenswrapper[4773]: I0121 15:47:29.982194 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cf4556c4-hwkr9" podStartSLOduration=13.98216784 podStartE2EDuration="13.98216784s" podCreationTimestamp="2026-01-21 15:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:29.97211328 +0000 UTC m=+1414.896602912" watchObservedRunningTime="2026-01-21 15:47:29.98216784 +0000 UTC m=+1414.906657462"
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.001225 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-794b46c66c-dj5lp" podStartSLOduration=4.177874801 podStartE2EDuration="17.001203332s" podCreationTimestamp="2026-01-21 15:47:13 +0000 UTC" firstStartedPulling="2026-01-21 15:47:15.18252964 +0000 UTC m=+1400.107019262" lastFinishedPulling="2026-01-21 15:47:28.005858171 +0000 UTC m=+1412.930347793" observedRunningTime="2026-01-21 15:47:29.999995979 +0000 UTC m=+1414.924485591" watchObservedRunningTime="2026-01-21 15:47:30.001203332 +0000 UTC m=+1414.925692964"
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.477941 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634360 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfxsm\" (UniqueName: \"kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634706 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634797 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.634842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd\") pod \"51eaf63a-c7b4-47eb-8357-e746bd703b64\" (UID: \"51eaf63a-c7b4-47eb-8357-e746bd703b64\") "
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.635248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.635732 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.635760 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51eaf63a-c7b4-47eb-8357-e746bd703b64-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.643889 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm" (OuterVolumeSpecName: "kube-api-access-lfxsm") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "kube-api-access-lfxsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.659759 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts" (OuterVolumeSpecName: "scripts") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.734911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.737198 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.737226 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.737237 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfxsm\" (UniqueName: \"kubernetes.io/projected/51eaf63a-c7b4-47eb-8357-e746bd703b64-kube-api-access-lfxsm\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.744081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data" (OuterVolumeSpecName: "config-data") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.750869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51eaf63a-c7b4-47eb-8357-e746bd703b64" (UID: "51eaf63a-c7b4-47eb-8357-e746bd703b64"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.839948 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.839995 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51eaf63a-c7b4-47eb-8357-e746bd703b64-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.962873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d45c98f8b-b4vzj" event={"ID":"36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d","Type":"ContainerStarted","Data":"f078d3f9071fe440d6655f5b6b8f8fe09e70fe916407f7af6cee1df2eabe780c"}
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.965172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-794b46c66c-dj5lp" event={"ID":"e7d5b884-1964-4585-a6b4-bd7813ee52c8","Type":"ContainerStarted","Data":"f9bf80b7e6aedf6cb7c9123d6406c33330979424286da5817bc26f2438053a90"}
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.967657 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51eaf63a-c7b4-47eb-8357-e746bd703b64","Type":"ContainerDied","Data":"a0d0ee75d0b94890be51f8aab3c9a286fc0c910dfc48815c76051383495fa26d"}
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.967727 4773 scope.go:117] "RemoveContainer" containerID="9f60cde77726f6306ecedc80c1577069686296b7d6f37e8456fe72da241e3cd5"
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.967840 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.972457 4773 generic.go:334] "Generic (PLEG): container finished" podID="cd62e746-7c8e-4a74-a37e-3daa482a53ba" containerID="06a517f5eb8b0202913474784c903634052c39116f05e4f7a356add716a82ac0" exitCode=0
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.972742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pdgr9" event={"ID":"cd62e746-7c8e-4a74-a37e-3daa482a53ba","Type":"ContainerDied","Data":"06a517f5eb8b0202913474784c903634052c39116f05e4f7a356add716a82ac0"}
Jan 21 15:47:30 crc kubenswrapper[4773]: I0121 15:47:30.998105 4773 scope.go:117] "RemoveContainer" containerID="1981941504f849effb65da2d5b25a3c47a6a97bbb7d5abeba9d270e23e5816ab"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.022104 4773 scope.go:117] "RemoveContainer" containerID="ae21538c3d66deff4e80c8fcb1b7b626228d77fa95d07860087df285515858ce"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.052032 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.064129 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.087793 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:47:31 crc kubenswrapper[4773]: E0121 15:47:31.088253 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="init"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088274 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="init"
Jan 21 15:47:31 crc kubenswrapper[4773]: E0121 15:47:31.088292 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-notification-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088301 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-notification-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: E0121 15:47:31.088318 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-central-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088325 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-central-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: E0121 15:47:31.088350 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="sg-core"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088357 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="sg-core"
Jan 21 15:47:31 crc kubenswrapper[4773]: E0121 15:47:31.088378 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="dnsmasq-dns"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088385 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="dnsmasq-dns"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088570 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-notification-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088591 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="sg-core"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088608 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="819f84c9-b51d-4f3c-81c2-c39af2110563" containerName="dnsmasq-dns"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.088617 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" containerName="ceilometer-central-agent"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.090413 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.095315 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.095578 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.129718 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.170246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.170411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.171233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.171308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.171386 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.171417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.171558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckxlx\" (UniqueName: \"kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273712 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.273841 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckxlx\" (UniqueName: \"kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.274273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.280793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.281398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.284378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.290255 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.295545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckxlx\" (UniqueName: \"kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx\") pod \"ceilometer-0\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.410744 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51eaf63a-c7b4-47eb-8357-e746bd703b64" path="/var/lib/kubelet/pods/51eaf63a-c7b4-47eb-8357-e746bd703b64/volumes"
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.465100 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:47:31 crc kubenswrapper[4773]: W0121 15:47:31.975307 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60bdf67b_df3f_4544_8c2f_08d0b2395149.slice/crio-e88d50ff186bbce0931bb143c7cfafbfed7e1c991a8fb05148c62e84de784fa9 WatchSource:0}: Error finding container e88d50ff186bbce0931bb143c7cfafbfed7e1c991a8fb05148c62e84de784fa9: Status 404 returned error can't find the container with id e88d50ff186bbce0931bb143c7cfafbfed7e1c991a8fb05148c62e84de784fa9
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.978819 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:47:31 crc kubenswrapper[4773]: I0121 15:47:31.990758 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.126544 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.510807 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pdgr9"
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.516744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts\") pod \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") "
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.516808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blb65\" (UniqueName: \"kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65\") pod \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\" (UID: \"cd62e746-7c8e-4a74-a37e-3daa482a53ba\") "
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.517990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd62e746-7c8e-4a74-a37e-3daa482a53ba" (UID: "cd62e746-7c8e-4a74-a37e-3daa482a53ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.526198 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65" (OuterVolumeSpecName: "kube-api-access-blb65") pod "cd62e746-7c8e-4a74-a37e-3daa482a53ba" (UID: "cd62e746-7c8e-4a74-a37e-3daa482a53ba"). InnerVolumeSpecName "kube-api-access-blb65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.618476 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd62e746-7c8e-4a74-a37e-3daa482a53ba-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:32 crc kubenswrapper[4773]: I0121 15:47:32.618549 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blb65\" (UniqueName: \"kubernetes.io/projected/cd62e746-7c8e-4a74-a37e-3daa482a53ba-kube-api-access-blb65\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:33 crc kubenswrapper[4773]: I0121 15:47:33.001386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerStarted","Data":"e88d50ff186bbce0931bb143c7cfafbfed7e1c991a8fb05148c62e84de784fa9"}
Jan 21 15:47:33 crc kubenswrapper[4773]: I0121 15:47:33.003319 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pdgr9"
Jan 21 15:47:33 crc kubenswrapper[4773]: I0121 15:47:33.003316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pdgr9" event={"ID":"cd62e746-7c8e-4a74-a37e-3daa482a53ba","Type":"ContainerDied","Data":"78966466cade9e2cb07c4f71e31bc5319cb30e02eb81a2172f99eee210314960"}
Jan 21 15:47:33 crc kubenswrapper[4773]: I0121 15:47:33.003364 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78966466cade9e2cb07c4f71e31bc5319cb30e02eb81a2172f99eee210314960"
Jan 21 15:47:37 crc kubenswrapper[4773]: I0121 15:47:37.129976 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cf4556c4-hwkr9" podUID="8e7d6f73-a63d-40a4-acda-12edb288ec53" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.182:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:47:38 crc kubenswrapper[4773]: I0121 15:47:38.051798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerStarted","Data":"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631"}
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.015466 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cd9678444-nmk9w"
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.133954 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerStarted","Data":"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9"}
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.854906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cf4556c4-hwkr9"
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.970216 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"]
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.970524 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-668bc45dd4-pd2gr" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api-log" containerID="cri-o://cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac" gracePeriod=30
Jan 21 15:47:39 crc kubenswrapper[4773]: I0121 15:47:39.970675 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-668bc45dd4-pd2gr" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api" containerID="cri-o://fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8" gracePeriod=30
Jan 21 15:47:40 crc kubenswrapper[4773]: I0121 15:47:40.164367 4773 generic.go:334] "Generic (PLEG): container finished" podID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerID="cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac" exitCode=143
Jan 21 15:47:40 crc kubenswrapper[4773]: I0121 15:47:40.164798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerDied","Data":"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac"}
Jan 21 15:47:40 crc kubenswrapper[4773]: I0121 15:47:40.176084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerStarted","Data":"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1"}
Jan 21 15:47:41 crc kubenswrapper[4773]: I0121 15:47:41.116784 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:41 crc kubenswrapper[4773]: I0121 15:47:41.325501 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cf6c78dd-68gm6"
Jan 21 15:47:41 crc kubenswrapper[4773]: I0121 15:47:41.670268 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6cb88dccdd-v7jgc"
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.199398 4773 generic.go:334] "Generic (PLEG): container finished" podID="de6a84b1-1846-4dd0-be7f-47a8872227ff" containerID="776ffd65e5f9d25fb0aabd5932a286dc6c16b7cc45cc3b53741f73dc28f94960" exitCode=0
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.199485 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-77kxr" event={"ID":"de6a84b1-1846-4dd0-be7f-47a8872227ff","Type":"ContainerDied","Data":"776ffd65e5f9d25fb0aabd5932a286dc6c16b7cc45cc3b53741f73dc28f94960"}
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.202801 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerStarted","Data":"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f"}
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.203010 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.246991 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.242730747 podStartE2EDuration="11.246968581s" podCreationTimestamp="2026-01-21 15:47:31 +0000 UTC" firstStartedPulling="2026-01-21 15:47:31.978906999 +0000 UTC m=+1416.903396631" lastFinishedPulling="2026-01-21 15:47:40.983144843 +0000 UTC m=+1425.907634465" observedRunningTime="2026-01-21 15:47:42.239800618 +0000 UTC m=+1427.164290250" watchObservedRunningTime="2026-01-21 15:47:42.246968581 +0000 UTC m=+1427.171458203"
Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.253780 4773
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-746866d6b5-jbp68" Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.346073 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cd9678444-nmk9w"] Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.346296 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cd9678444-nmk9w" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-api" containerID="cri-o://4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a" gracePeriod=30 Jan 21 15:47:42 crc kubenswrapper[4773]: I0121 15:47:42.346794 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cd9678444-nmk9w" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-httpd" containerID="cri-o://33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a" gracePeriod=30 Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.215215 4773 generic.go:334] "Generic (PLEG): container finished" podID="5c0beb28-481c-4507-94c2-d644e4faf5ab" containerID="4db387886c94d9295b7d0446bd4a1606255259c595aabc38c5d6e12d844dc3b8" exitCode=0 Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.215298 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znjg2" event={"ID":"5c0beb28-481c-4507-94c2-d644e4faf5ab","Type":"ContainerDied","Data":"4db387886c94d9295b7d0446bd4a1606255259c595aabc38c5d6e12d844dc3b8"} Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.217836 4773 generic.go:334] "Generic (PLEG): container finished" podID="133a51b0-d4c1-4515-b260-143df28703df" containerID="33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a" exitCode=0 Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.217897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" 
event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerDied","Data":"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a"} Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.766763 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.843038 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data\") pod \"de6a84b1-1846-4dd0-be7f-47a8872227ff\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.843310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle\") pod \"de6a84b1-1846-4dd0-be7f-47a8872227ff\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.843397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs\") pod \"de6a84b1-1846-4dd0-be7f-47a8872227ff\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.843423 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrlkd\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd\") pod \"de6a84b1-1846-4dd0-be7f-47a8872227ff\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.843619 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts\") pod 
\"de6a84b1-1846-4dd0-be7f-47a8872227ff\" (UID: \"de6a84b1-1846-4dd0-be7f-47a8872227ff\") " Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.849227 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd" (OuterVolumeSpecName: "kube-api-access-hrlkd") pod "de6a84b1-1846-4dd0-be7f-47a8872227ff" (UID: "de6a84b1-1846-4dd0-be7f-47a8872227ff"). InnerVolumeSpecName "kube-api-access-hrlkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.850522 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts" (OuterVolumeSpecName: "scripts") pod "de6a84b1-1846-4dd0-be7f-47a8872227ff" (UID: "de6a84b1-1846-4dd0-be7f-47a8872227ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.850769 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs" (OuterVolumeSpecName: "certs") pod "de6a84b1-1846-4dd0-be7f-47a8872227ff" (UID: "de6a84b1-1846-4dd0-be7f-47a8872227ff"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.883254 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de6a84b1-1846-4dd0-be7f-47a8872227ff" (UID: "de6a84b1-1846-4dd0-be7f-47a8872227ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.912852 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data" (OuterVolumeSpecName: "config-data") pod "de6a84b1-1846-4dd0-be7f-47a8872227ff" (UID: "de6a84b1-1846-4dd0-be7f-47a8872227ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.945867 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.945907 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.945922 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.945933 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrlkd\" (UniqueName: \"kubernetes.io/projected/de6a84b1-1846-4dd0-be7f-47a8872227ff-kube-api-access-hrlkd\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.945943 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6a84b1-1846-4dd0-be7f-47a8872227ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:43 crc kubenswrapper[4773]: I0121 15:47:43.965860 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.047575 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle\") pod \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.047707 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom\") pod \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.047767 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs\") pod \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.047838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckr9b\" (UniqueName: \"kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b\") pod \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.047931 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data\") pod \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\" (UID: \"30c0bedd-4d99-4e6e-9276-7dcacb65b18f\") " Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.048630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs" (OuterVolumeSpecName: "logs") pod "30c0bedd-4d99-4e6e-9276-7dcacb65b18f" (UID: "30c0bedd-4d99-4e6e-9276-7dcacb65b18f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.051432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30c0bedd-4d99-4e6e-9276-7dcacb65b18f" (UID: "30c0bedd-4d99-4e6e-9276-7dcacb65b18f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.051613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b" (OuterVolumeSpecName: "kube-api-access-ckr9b") pod "30c0bedd-4d99-4e6e-9276-7dcacb65b18f" (UID: "30c0bedd-4d99-4e6e-9276-7dcacb65b18f"). InnerVolumeSpecName "kube-api-access-ckr9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.080499 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c0bedd-4d99-4e6e-9276-7dcacb65b18f" (UID: "30c0bedd-4d99-4e6e-9276-7dcacb65b18f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.102386 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data" (OuterVolumeSpecName: "config-data") pod "30c0bedd-4d99-4e6e-9276-7dcacb65b18f" (UID: "30c0bedd-4d99-4e6e-9276-7dcacb65b18f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.150442 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.150484 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.150497 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.150509 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckr9b\" (UniqueName: \"kubernetes.io/projected/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-kube-api-access-ckr9b\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.150523 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0bedd-4d99-4e6e-9276-7dcacb65b18f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.236873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-77kxr" event={"ID":"de6a84b1-1846-4dd0-be7f-47a8872227ff","Type":"ContainerDied","Data":"b3cfdc61907b073f19e71c6eab39ed7e0fd17bc6884b3741d397ed7c6fc15d19"} Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.236987 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3cfdc61907b073f19e71c6eab39ed7e0fd17bc6884b3741d397ed7c6fc15d19" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 
15:47:44.237058 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-77kxr" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.258082 4773 generic.go:334] "Generic (PLEG): container finished" podID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerID="fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8" exitCode=0 Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.258282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerDied","Data":"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8"} Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.258324 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-668bc45dd4-pd2gr" event={"ID":"30c0bedd-4d99-4e6e-9276-7dcacb65b18f","Type":"ContainerDied","Data":"e13007318f90a0c91eed3026a5b6da32c72b83c06904d6573bbec7943124769e"} Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.258345 4773 scope.go:117] "RemoveContainer" containerID="fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.258339 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-668bc45dd4-pd2gr" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.321192 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"] Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.329511 4773 scope.go:117] "RemoveContainer" containerID="cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.335068 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-668bc45dd4-pd2gr"] Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.436865 4773 scope.go:117] "RemoveContainer" containerID="fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8" Jan 21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.440990 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8\": container with ID starting with fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8 not found: ID does not exist" containerID="fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.441043 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8"} err="failed to get container status \"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8\": rpc error: code = NotFound desc = could not find container \"fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8\": container with ID starting with fce34d75984797aa26c48a26ed1adfc3ceeaf856383f3bef34daf8b3be5368c8 not found: ID does not exist" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.441072 4773 scope.go:117] "RemoveContainer" containerID="cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac" Jan 
21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.449870 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac\": container with ID starting with cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac not found: ID does not exist" containerID="cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.449927 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac"} err="failed to get container status \"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac\": rpc error: code = NotFound desc = could not find container \"cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac\": container with ID starting with cb4e23273094c54a8d2e04702c9398c42829a52bfeff64303141fdb1442909ac not found: ID does not exist" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.489757 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-9pxhf"] Jan 21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.490437 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff" containerName="cloudkitty-db-sync" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490448 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff" containerName="cloudkitty-db-sync" Jan 21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.490470 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api-log" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490476 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" 
containerName="barbican-api-log" Jan 21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.490485 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd62e746-7c8e-4a74-a37e-3daa482a53ba" containerName="mariadb-account-create-update" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490491 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd62e746-7c8e-4a74-a37e-3daa482a53ba" containerName="mariadb-account-create-update" Jan 21 15:47:44 crc kubenswrapper[4773]: E0121 15:47:44.490511 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490516 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490686 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490716 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" containerName="barbican-api-log" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490743 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd62e746-7c8e-4a74-a37e-3daa482a53ba" containerName="mariadb-account-create-update" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.490751 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff" containerName="cloudkitty-db-sync" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.491410 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.495911 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.496108 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.496206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-x9snq" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.496314 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.504075 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.514388 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9pxhf"] Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.599516 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpssn\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.599585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.599663 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.599726 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.599765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.701407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.701593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpssn\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.701634 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.701688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.701747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.714838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.715092 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.715104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.715297 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.739765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpssn\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn\") pod \"cloudkitty-storageinit-9pxhf\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") " pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.939751 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-znjg2" Jan 21 15:47:44 crc kubenswrapper[4773]: I0121 15:47:44.954158 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-9pxhf" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.108984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.109056 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjff\" (UniqueName: \"kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.109089 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.109492 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.109530 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.109622 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data\") pod \"5c0beb28-481c-4507-94c2-d644e4faf5ab\" (UID: \"5c0beb28-481c-4507-94c2-d644e4faf5ab\") " Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.112386 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.127379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.127453 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts" (OuterVolumeSpecName: "scripts") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.131853 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff" (OuterVolumeSpecName: "kube-api-access-4mjff") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "kube-api-access-4mjff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.168951 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.193528 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data" (OuterVolumeSpecName: "config-data") pod "5c0beb28-481c-4507-94c2-d644e4faf5ab" (UID: "5c0beb28-481c-4507-94c2-d644e4faf5ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212267 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212334 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mjff\" (UniqueName: \"kubernetes.io/projected/5c0beb28-481c-4507-94c2-d644e4faf5ab-kube-api-access-4mjff\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212346 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212357 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-scripts\") on node \"crc\" 
DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212368 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c0beb28-481c-4507-94c2-d644e4faf5ab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.212376 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0beb28-481c-4507-94c2-d644e4faf5ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.271856 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-znjg2" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.272708 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znjg2" event={"ID":"5c0beb28-481c-4507-94c2-d644e4faf5ab","Type":"ContainerDied","Data":"06a9dd01ae16972006c7663ab8cb0e550476406076303d343aba67dcfe4a7537"} Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.272809 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a9dd01ae16972006c7663ab8cb0e550476406076303d343aba67dcfe4a7537" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.398122 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c0bedd-4d99-4e6e-9276-7dcacb65b18f" path="/var/lib/kubelet/pods/30c0bedd-4d99-4e6e-9276-7dcacb65b18f/volumes" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.424672 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 21 15:47:45 crc kubenswrapper[4773]: E0121 15:47:45.425199 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" containerName="cinder-db-sync" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.425225 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" containerName="cinder-db-sync" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.425487 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" containerName="cinder-db-sync" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.426203 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.430008 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.430646 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.431391 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6l55q" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.443832 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.519275 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9pxhf"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.534899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jcvv\" (UniqueName: \"kubernetes.io/projected/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-kube-api-access-4jcvv\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.535004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config\") pod \"openstackclient\" (UID: 
\"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.535030 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.535180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.575083 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.594426 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.600869 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.601011 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lp86k" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.601218 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.601378 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.627115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.639489 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.639642 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jcvv\" (UniqueName: \"kubernetes.io/projected/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-kube-api-access-4jcvv\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.639817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " 
pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.639859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.651567 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.656724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.678203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.685152 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jcvv\" (UniqueName: \"kubernetes.io/projected/d34079f2-2d08-4ddc-8d49-a9afaadaba8c-kube-api-access-4jcvv\") pod \"openstackclient\" (UID: \"d34079f2-2d08-4ddc-8d49-a9afaadaba8c\") " pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.736760 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.739082 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.743772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.743896 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.743922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n24r\" (UniqueName: \"kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.744012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.744111 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.744197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.745684 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.786025 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853188 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853274 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbmh\" (UniqueName: \"kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853501 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n24r\" (UniqueName: \"kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " 
pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853869 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.853960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.861760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc 
kubenswrapper[4773]: I0121 15:47:45.861845 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.863434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.867848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.874606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.875630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.875650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.897974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.899913 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5n24r\" (UniqueName: \"kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r\") pod \"cinder-scheduler-0\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " pod="openstack/cinder-scheduler-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.906143 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.956258 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957100 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 
15:47:45.957328 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5gt\" (UniqueName: \"kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957473 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957496 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.957718 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbmh\" (UniqueName: \"kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.959948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.960444 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.961465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.961532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.962156 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:45 crc kubenswrapper[4773]: I0121 15:47:45.977445 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbmh\" (UniqueName: \"kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh\") pod \"dnsmasq-dns-5c9776ccc5-c2frn\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.058854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data\") pod \"cinder-api-0\" 
(UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.058951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5gt\" (UniqueName: \"kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.058986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.059025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.059043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.059067 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.059093 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.060879 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.061279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.067414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.071950 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.073991 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.077463 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.082423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5gt\" (UniqueName: \"kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt\") pod \"cinder-api-0\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.103390 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.140047 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.194948 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.328599 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9pxhf" event={"ID":"acd3ea21-489e-4693-8c06-1ec6af224609","Type":"ContainerStarted","Data":"f2c7b6ef123f16d70ab794a89c465b4d07965eab16e83e431aa8d70489a42d31"} Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.328998 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9pxhf" event={"ID":"acd3ea21-489e-4693-8c06-1ec6af224609","Type":"ContainerStarted","Data":"d1e513f54d757daa4108880ad77e403049d3969464d25790b26edffec515d2d6"} Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.358066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.360379 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-9pxhf" podStartSLOduration=2.360356783 podStartE2EDuration="2.360356783s" podCreationTimestamp="2026-01-21 15:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:46.345763931 +0000 UTC m=+1431.270253553" watchObservedRunningTime="2026-01-21 15:47:46.360356783 +0000 UTC m=+1431.284846405" Jan 21 15:47:46 crc kubenswrapper[4773]: W0121 15:47:46.363465 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34079f2_2d08_4ddc_8d49_a9afaadaba8c.slice/crio-f70f7427d47123c7395611c5c4fac101422fd1d3d231f253202e12666ce637aa WatchSource:0}: Error finding container f70f7427d47123c7395611c5c4fac101422fd1d3d231f253202e12666ce637aa: Status 404 returned error can't find the container with id f70f7427d47123c7395611c5c4fac101422fd1d3d231f253202e12666ce637aa Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 
15:47:46.791984 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:47:46 crc kubenswrapper[4773]: I0121 15:47:46.949600 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:47 crc kubenswrapper[4773]: I0121 15:47:47.137128 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 15:47:47 crc kubenswrapper[4773]: W0121 15:47:47.198849 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb64220_6224_45f9_ad72_881974ee8a05.slice/crio-f81be3980f8b5742ed58233733ec8e12a64a20a02b12da9c401b5173aa27fc49 WatchSource:0}: Error finding container f81be3980f8b5742ed58233733ec8e12a64a20a02b12da9c401b5173aa27fc49: Status 404 returned error can't find the container with id f81be3980f8b5742ed58233733ec8e12a64a20a02b12da9c401b5173aa27fc49 Jan 21 15:47:47 crc kubenswrapper[4773]: I0121 15:47:47.413043 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" event={"ID":"13e848be-010f-4797-b0cc-a5ff046c55c8","Type":"ContainerStarted","Data":"24d30802af6b57a692a8baea3aea2ce72a8c875fec2926ce0de2141e0939fe7a"} Jan 21 15:47:47 crc kubenswrapper[4773]: I0121 15:47:47.429410 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d34079f2-2d08-4ddc-8d49-a9afaadaba8c","Type":"ContainerStarted","Data":"f70f7427d47123c7395611c5c4fac101422fd1d3d231f253202e12666ce637aa"} Jan 21 15:47:47 crc kubenswrapper[4773]: I0121 15:47:47.432934 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerStarted","Data":"1d3bffd4a7af84d06998a91d2378dd2c080e3a22b3ed4092a2d88ac1604f9dda"} Jan 21 15:47:47 crc kubenswrapper[4773]: I0121 15:47:47.450383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerStarted","Data":"f81be3980f8b5742ed58233733ec8e12a64a20a02b12da9c401b5173aa27fc49"} Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.169117 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.473816 4773 generic.go:334] "Generic (PLEG): container finished" podID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerID="0518ae775e48707147dddeeb8f0262773b33a78705b42a67bb0dca0a5f09853f" exitCode=0 Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.474738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" event={"ID":"13e848be-010f-4797-b0cc-a5ff046c55c8","Type":"ContainerDied","Data":"0518ae775e48707147dddeeb8f0262773b33a78705b42a67bb0dca0a5f09853f"} Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.478937 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.521834 4773 generic.go:334] "Generic (PLEG): container finished" podID="133a51b0-d4c1-4515-b260-143df28703df" containerID="4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a" exitCode=0 Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.521882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerDied","Data":"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a"} Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.521912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd9678444-nmk9w" event={"ID":"133a51b0-d4c1-4515-b260-143df28703df","Type":"ContainerDied","Data":"196e08daaf69876c023c03ec9602563fa9350b526f68381f33630239f39eb746"} Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.521934 4773 scope.go:117] "RemoveContainer" containerID="33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.559605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle\") pod \"133a51b0-d4c1-4515-b260-143df28703df\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.559682 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config\") pod \"133a51b0-d4c1-4515-b260-143df28703df\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.559896 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sr5v\" 
(UniqueName: \"kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v\") pod \"133a51b0-d4c1-4515-b260-143df28703df\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.559975 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config\") pod \"133a51b0-d4c1-4515-b260-143df28703df\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.560031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs\") pod \"133a51b0-d4c1-4515-b260-143df28703df\" (UID: \"133a51b0-d4c1-4515-b260-143df28703df\") " Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.586512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v" (OuterVolumeSpecName: "kube-api-access-6sr5v") pod "133a51b0-d4c1-4515-b260-143df28703df" (UID: "133a51b0-d4c1-4515-b260-143df28703df"). InnerVolumeSpecName "kube-api-access-6sr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.597937 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "133a51b0-d4c1-4515-b260-143df28703df" (UID: "133a51b0-d4c1-4515-b260-143df28703df"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.650029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "133a51b0-d4c1-4515-b260-143df28703df" (UID: "133a51b0-d4c1-4515-b260-143df28703df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.666412 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sr5v\" (UniqueName: \"kubernetes.io/projected/133a51b0-d4c1-4515-b260-143df28703df-kube-api-access-6sr5v\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.666560 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.666571 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.761790 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config" (OuterVolumeSpecName: "config") pod "133a51b0-d4c1-4515-b260-143df28703df" (UID: "133a51b0-d4c1-4515-b260-143df28703df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.769333 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.799177 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "133a51b0-d4c1-4515-b260-143df28703df" (UID: "133a51b0-d4c1-4515-b260-143df28703df"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.842243 4773 scope.go:117] "RemoveContainer" containerID="4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.871040 4773 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/133a51b0-d4c1-4515-b260-143df28703df-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.879559 4773 scope.go:117] "RemoveContainer" containerID="33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a" Jan 21 15:47:48 crc kubenswrapper[4773]: E0121 15:47:48.880056 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a\": container with ID starting with 33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a not found: ID does not exist" containerID="33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.880080 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a"} err="failed to get container status \"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a\": rpc error: code = NotFound desc = could not find container \"33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a\": container with ID starting with 33520bccfe4acf7b5bf8d10806fe82b515177d51e2828c1260f096a3acd2f35a not found: ID does not exist" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.880102 4773 scope.go:117] "RemoveContainer" containerID="4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a" Jan 21 15:47:48 crc kubenswrapper[4773]: E0121 15:47:48.880775 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a\": container with ID starting with 4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a not found: ID does not exist" containerID="4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a" Jan 21 15:47:48 crc kubenswrapper[4773]: I0121 15:47:48.880813 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a"} err="failed to get container status \"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a\": rpc error: code = NotFound desc = could not find container \"4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a\": container with ID starting with 4b47e6f59ac014b7882b50e0eaf2fd4244405c85ac5afd8f41325806293d724a not found: ID does not exist" Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.587160 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" 
event={"ID":"13e848be-010f-4797-b0cc-a5ff046c55c8","Type":"ContainerStarted","Data":"b758c5f4bd744eb212b4eb5cdd14c37d44a004cd4fff5184c06a2cf6552283ad"} Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.587857 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.597300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerStarted","Data":"65336c42cda405dbe37e125660327bccfef75e0eae15549918240c0c3f8b4376"} Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.603742 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd9678444-nmk9w" Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.616622 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerStarted","Data":"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"} Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.636993 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" podStartSLOduration=4.636971172 podStartE2EDuration="4.636971172s" podCreationTimestamp="2026-01-21 15:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:49.617226271 +0000 UTC m=+1434.541715903" watchObservedRunningTime="2026-01-21 15:47:49.636971172 +0000 UTC m=+1434.561460794" Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.662706 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cd9678444-nmk9w"] Jan 21 15:47:49 crc kubenswrapper[4773]: I0121 15:47:49.716652 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-cd9678444-nmk9w"] Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.631065 4773 generic.go:334] "Generic (PLEG): container finished" podID="acd3ea21-489e-4693-8c06-1ec6af224609" containerID="f2c7b6ef123f16d70ab794a89c465b4d07965eab16e83e431aa8d70489a42d31" exitCode=0 Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.631151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9pxhf" event={"ID":"acd3ea21-489e-4693-8c06-1ec6af224609","Type":"ContainerDied","Data":"f2c7b6ef123f16d70ab794a89c465b4d07965eab16e83e431aa8d70489a42d31"} Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.634819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerStarted","Data":"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"} Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.635147 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api" containerID="cri-o://71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90" gracePeriod=30 Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.635198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.635270 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api-log" containerID="cri-o://fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6" gracePeriod=30 Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.644876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerStarted","Data":"15df608948326e527aa465468ab2efcdcacadd9642187de89fa4409e4cf7d3c2"} Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.684343 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.684318549 podStartE2EDuration="5.684318549s" podCreationTimestamp="2026-01-21 15:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:50.674628009 +0000 UTC m=+1435.599117651" watchObservedRunningTime="2026-01-21 15:47:50.684318549 +0000 UTC m=+1435.608808181" Jan 21 15:47:50 crc kubenswrapper[4773]: I0121 15:47:50.713337 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.635820582 podStartE2EDuration="5.713318999s" podCreationTimestamp="2026-01-21 15:47:45 +0000 UTC" firstStartedPulling="2026-01-21 15:47:46.805056333 +0000 UTC m=+1431.729545965" lastFinishedPulling="2026-01-21 15:47:47.88255476 +0000 UTC m=+1432.807044382" observedRunningTime="2026-01-21 15:47:50.71149226 +0000 UTC m=+1435.635981892" watchObservedRunningTime="2026-01-21 15:47:50.713318999 +0000 UTC m=+1435.637808621" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.106806 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.422917 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133a51b0-d4c1-4515-b260-143df28703df" path="/var/lib/kubelet/pods/133a51b0-d4c1-4515-b260-143df28703df/volumes" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.519803 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.646691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n5gt\" (UniqueName: \"kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.646986 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647061 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647143 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs" (OuterVolumeSpecName: "logs") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.647428 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts\") pod \"ecb64220-6224-45f9-ad72-881974ee8a05\" (UID: \"ecb64220-6224-45f9-ad72-881974ee8a05\") " Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.648450 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb64220-6224-45f9-ad72-881974ee8a05-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.648477 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecb64220-6224-45f9-ad72-881974ee8a05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.657973 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt" (OuterVolumeSpecName: "kube-api-access-4n5gt") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "kube-api-access-4n5gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.664526 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.671523 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts" (OuterVolumeSpecName: "scripts") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.715929 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.726795 4773 generic.go:334] "Generic (PLEG): container finished" podID="ecb64220-6224-45f9-ad72-881974ee8a05" containerID="71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90" exitCode=0 Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.726834 4773 generic.go:334] "Generic (PLEG): container finished" podID="ecb64220-6224-45f9-ad72-881974ee8a05" containerID="fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6" exitCode=143 Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.727080 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.727121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerDied","Data":"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"}
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.727176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerDied","Data":"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"}
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.727188 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecb64220-6224-45f9-ad72-881974ee8a05","Type":"ContainerDied","Data":"f81be3980f8b5742ed58233733ec8e12a64a20a02b12da9c401b5173aa27fc49"}
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.727207 4773 scope.go:117] "RemoveContainer" containerID="71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.749938 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data" (OuterVolumeSpecName: "config-data") pod "ecb64220-6224-45f9-ad72-881974ee8a05" (UID: "ecb64220-6224-45f9-ad72-881974ee8a05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.750640 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.750664 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.750682 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.750760 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecb64220-6224-45f9-ad72-881974ee8a05-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.750775 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n5gt\" (UniqueName: \"kubernetes.io/projected/ecb64220-6224-45f9-ad72-881974ee8a05-kube-api-access-4n5gt\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.826444 4773 scope.go:117] "RemoveContainer" containerID="fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.870913 4773 scope.go:117] "RemoveContainer" containerID="71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"
Jan 21 15:47:51 crc kubenswrapper[4773]: E0121 15:47:51.871940 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90\": container with ID starting with 71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90 not found: ID does not exist" containerID="71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.871984 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"} err="failed to get container status \"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90\": rpc error: code = NotFound desc = could not find container \"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90\": container with ID starting with 71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90 not found: ID does not exist"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.872011 4773 scope.go:117] "RemoveContainer" containerID="fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"
Jan 21 15:47:51 crc kubenswrapper[4773]: E0121 15:47:51.872673 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6\": container with ID starting with fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6 not found: ID does not exist" containerID="fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.872733 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"} err="failed to get container status \"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6\": rpc error: code = NotFound desc = could not find container \"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6\": container with ID starting with fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6 not found: ID does not exist"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.872763 4773 scope.go:117] "RemoveContainer" containerID="71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.873188 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90"} err="failed to get container status \"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90\": rpc error: code = NotFound desc = could not find container \"71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90\": container with ID starting with 71172fb787804f3a92bfaf9993728963700a629ddd890beffa445d5ff5967f90 not found: ID does not exist"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.873216 4773 scope.go:117] "RemoveContainer" containerID="fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"
Jan 21 15:47:51 crc kubenswrapper[4773]: I0121 15:47:51.873460 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6"} err="failed to get container status \"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6\": rpc error: code = NotFound desc = could not find container \"fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6\": container with ID starting with fbfd67df3c36e67d07c78fb98036c74989fda1e4f3dc1f356ebe3ff552efb7a6 not found: ID does not exist"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.103483 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.118251 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.137947 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 21 15:47:52 crc kubenswrapper[4773]: E0121 15:47:52.138419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api-log"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138433 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api-log"
Jan 21 15:47:52 crc kubenswrapper[4773]: E0121 15:47:52.138456 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-httpd"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138464 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-httpd"
Jan 21 15:47:52 crc kubenswrapper[4773]: E0121 15:47:52.138507 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138518 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: E0121 15:47:52.138531 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138538 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138799 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138814 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" containerName="cinder-api-log"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138827 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-api"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.138845 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="133a51b0-d4c1-4515-b260-143df28703df" containerName="neutron-httpd"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.140085 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.145389 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.145651 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.145824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.150787 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260169 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68s5n\" (UniqueName: \"kubernetes.io/projected/4453fa11-ade2-4d7d-a714-67525df64b70-kube-api-access-68s5n\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260240 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4453fa11-ade2-4d7d-a714-67525df64b70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260304 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4453fa11-ade2-4d7d-a714-67525df64b70-logs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data-custom\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260482 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-scripts\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.260574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.317763 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9pxhf"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.319759 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6dcbb8dcff-bjhnc"]
Jan 21 15:47:52 crc kubenswrapper[4773]: E0121 15:47:52.320630 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd3ea21-489e-4693-8c06-1ec6af224609" containerName="cloudkitty-storageinit"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.320661 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd3ea21-489e-4693-8c06-1ec6af224609" containerName="cloudkitty-storageinit"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.321021 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd3ea21-489e-4693-8c06-1ec6af224609" containerName="cloudkitty-storageinit"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.323217 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.330672 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.330802 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.331985 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.339909 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6dcbb8dcff-bjhnc"]
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.362585 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data\") pod \"acd3ea21-489e-4693-8c06-1ec6af224609\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") "
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.362651 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle\") pod \"acd3ea21-489e-4693-8c06-1ec6af224609\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") "
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.362686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs\") pod \"acd3ea21-489e-4693-8c06-1ec6af224609\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") "
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.362828 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpssn\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn\") pod \"acd3ea21-489e-4693-8c06-1ec6af224609\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") "
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.362910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts\") pod \"acd3ea21-489e-4693-8c06-1ec6af224609\" (UID: \"acd3ea21-489e-4693-8c06-1ec6af224609\") "
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4453fa11-ade2-4d7d-a714-67525df64b70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4453fa11-ade2-4d7d-a714-67525df64b70-logs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data-custom\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-scripts\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.363685 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68s5n\" (UniqueName: \"kubernetes.io/projected/4453fa11-ade2-4d7d-a714-67525df64b70-kube-api-access-68s5n\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.369938 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.371636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4453fa11-ade2-4d7d-a714-67525df64b70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.372073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4453fa11-ade2-4d7d-a714-67525df64b70-logs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.374087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts" (OuterVolumeSpecName: "scripts") pod "acd3ea21-489e-4693-8c06-1ec6af224609" (UID: "acd3ea21-489e-4693-8c06-1ec6af224609"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.379364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.380261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data-custom\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.388182 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-scripts\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.390377 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68s5n\" (UniqueName: \"kubernetes.io/projected/4453fa11-ade2-4d7d-a714-67525df64b70-kube-api-access-68s5n\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.392161 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs" (OuterVolumeSpecName: "certs") pod "acd3ea21-489e-4693-8c06-1ec6af224609" (UID: "acd3ea21-489e-4693-8c06-1ec6af224609"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.402267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.405558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4453fa11-ade2-4d7d-a714-67525df64b70-config-data\") pod \"cinder-api-0\" (UID: \"4453fa11-ade2-4d7d-a714-67525df64b70\") " pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.407619 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn" (OuterVolumeSpecName: "kube-api-access-rpssn") pod "acd3ea21-489e-4693-8c06-1ec6af224609" (UID: "acd3ea21-489e-4693-8c06-1ec6af224609"). InnerVolumeSpecName "kube-api-access-rpssn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.452947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data" (OuterVolumeSpecName: "config-data") pod "acd3ea21-489e-4693-8c06-1ec6af224609" (UID: "acd3ea21-489e-4693-8c06-1ec6af224609"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-etc-swift\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-run-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465473 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwls\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-kube-api-access-kpwls\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465494 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-combined-ca-bundle\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-log-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465613 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-public-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-config-data\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-internal-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465897 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465931 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465941 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.465949 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpssn\" (UniqueName: \"kubernetes.io/projected/acd3ea21-489e-4693-8c06-1ec6af224609-kube-api-access-rpssn\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.487438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.506495 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd3ea21-489e-4693-8c06-1ec6af224609" (UID: "acd3ea21-489e-4693-8c06-1ec6af224609"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-etc-swift\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-run-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwls\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-kube-api-access-kpwls\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568101 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-combined-ca-bundle\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-log-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-public-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-config-data\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568314 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-internal-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.568364 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd3ea21-489e-4693-8c06-1ec6af224609-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.569460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-run-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.572569 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-combined-ca-bundle\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.572994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22988650-1474-4ba4-a6c0-2deb003ae3e7-log-httpd\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.576557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-internal-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.576783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-public-tls-certs\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.577491 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22988650-1474-4ba4-a6c0-2deb003ae3e7-config-data\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.586968 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-etc-swift\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.600529 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwls\" (UniqueName: \"kubernetes.io/projected/22988650-1474-4ba4-a6c0-2deb003ae3e7-kube-api-access-kpwls\") pod \"swift-proxy-6dcbb8dcff-bjhnc\" (UID: \"22988650-1474-4ba4-a6c0-2deb003ae3e7\") " pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.674133 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.794308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9pxhf" event={"ID":"acd3ea21-489e-4693-8c06-1ec6af224609","Type":"ContainerDied","Data":"d1e513f54d757daa4108880ad77e403049d3969464d25790b26edffec515d2d6"}
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.794360 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e513f54d757daa4108880ad77e403049d3969464d25790b26edffec515d2d6"
Jan 21 15:47:52 crc kubenswrapper[4773]: I0121 15:47:52.794458 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9pxhf"
Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.009389 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.011296 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.019046 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-x9snq" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.019445 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.019574 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.019689 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.019841 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.041248 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111212 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mb74\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111508 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.111760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.190112 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.190589 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="dnsmasq-dns" containerID="cri-o://b758c5f4bd744eb212b4eb5cdd14c37d44a004cd4fff5184c06a2cf6552283ad" gracePeriod=10 Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.206899 
4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.224068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.224165 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.224214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mb74\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.224262 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.224294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 
15:47:53.224326 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.246565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.248276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.250576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.259875 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.262947 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.294471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mb74\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.296607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.322515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.326112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.351855 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc 
kubenswrapper[4773]: I0121 15:47:53.351953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.352134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zfbz\" (UniqueName: \"kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.352389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.352481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.342137 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.388762 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.456978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.461589 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zfbz\" (UniqueName: \"kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.462109 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.462270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.462411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 
15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.462542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.463485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.459918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.465166 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.470409 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.478774 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.509292 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb64220-6224-45f9-ad72-881974ee8a05" path="/var/lib/kubelet/pods/ecb64220-6224-45f9-ad72-881974ee8a05/volumes" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.510308 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.515111 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zfbz\" (UniqueName: \"kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz\") pod \"dnsmasq-dns-67bdc55879-k4dvm\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.645868 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.648385 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.651222 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.661053 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.757405 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74k2w\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777843 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc 
kubenswrapper[4773]: I0121 15:47:53.777864 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.777933 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881563 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74k2w\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881645 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881782 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881804 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881869 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.881964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.882323 4773 generic.go:334] "Generic (PLEG): container finished" podID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerID="b758c5f4bd744eb212b4eb5cdd14c37d44a004cd4fff5184c06a2cf6552283ad" exitCode=0 Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.882412 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" event={"ID":"13e848be-010f-4797-b0cc-a5ff046c55c8","Type":"ContainerDied","Data":"b758c5f4bd744eb212b4eb5cdd14c37d44a004cd4fff5184c06a2cf6552283ad"} Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.892836 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.893220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.909300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.918927 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4453fa11-ade2-4d7d-a714-67525df64b70","Type":"ContainerStarted","Data":"f7a6eff0158d3048011f2abf89de1644e0f0b7e52e0a706787093da28d7b7983"} Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.922537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.924385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74k2w\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.928399 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.947087 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs\") pod \"cloudkitty-api-0\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " pod="openstack/cloudkitty-api-0" Jan 21 15:47:53 crc kubenswrapper[4773]: I0121 15:47:53.985615 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.034642 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6dcbb8dcff-bjhnc"] Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.285414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.719404 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.807630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.807976 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.808153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.808251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkbmh\" (UniqueName: \"kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.812217 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.812313 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb\") pod \"13e848be-010f-4797-b0cc-a5ff046c55c8\" (UID: \"13e848be-010f-4797-b0cc-a5ff046c55c8\") " Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.817559 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh" (OuterVolumeSpecName: "kube-api-access-rkbmh") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). InnerVolumeSpecName "kube-api-access-rkbmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.821433 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.916726 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkbmh\" (UniqueName: \"kubernetes.io/projected/13e848be-010f-4797-b0cc-a5ff046c55c8-kube-api-access-rkbmh\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:54 crc kubenswrapper[4773]: I0121 15:47:54.984573 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.016186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" event={"ID":"65527bca-7849-47c3-ad54-11916b724542","Type":"ContainerStarted","Data":"120b79aa5084dfd88c537bcd04bd5032ef3d7786cbc9c0afd60c2e1c1db62139"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.018346 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.020646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ac95d669-f09a-43e9-a44e-088a7761fba8","Type":"ContainerStarted","Data":"6451e01cac51a1712392f085cf104b6af0eef5a5b90dcf5f6aca588261daf8c8"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.029325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" event={"ID":"22988650-1474-4ba4-a6c0-2deb003ae3e7","Type":"ContainerStarted","Data":"b98b7d25d54183e7bffaafa1e02cf0a8f24fc2882bb96e31728e73e7b8066590"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.029371 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" event={"ID":"22988650-1474-4ba4-a6c0-2deb003ae3e7","Type":"ContainerStarted","Data":"915e69fcdbb3badda9e79d72614f8640de0f176954aef765ce46f79750bf6d38"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.037659 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config" (OuterVolumeSpecName: "config") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.048527 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.048540 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-c2frn" event={"ID":"13e848be-010f-4797-b0cc-a5ff046c55c8","Type":"ContainerDied","Data":"24d30802af6b57a692a8baea3aea2ce72a8c875fec2926ce0de2141e0939fe7a"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.048617 4773 scope.go:117] "RemoveContainer" containerID="b758c5f4bd744eb212b4eb5cdd14c37d44a004cd4fff5184c06a2cf6552283ad" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.051050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.056115 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.064947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4453fa11-ade2-4d7d-a714-67525df64b70","Type":"ContainerStarted","Data":"f78350aa800c06a0c200129515395e58da325f22326df6606749765789a5f573"} Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.068974 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.119712 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13e848be-010f-4797-b0cc-a5ff046c55c8" (UID: "13e848be-010f-4797-b0cc-a5ff046c55c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.120018 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.120092 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.120104 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.222931 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e848be-010f-4797-b0cc-a5ff046c55c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.294440 4773 scope.go:117] "RemoveContainer" containerID="0518ae775e48707147dddeeb8f0262773b33a78705b42a67bb0dca0a5f09853f" Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.676825 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:55 crc kubenswrapper[4773]: I0121 15:47:55.690862 4773 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-c2frn"] Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.005748 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.006050 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-central-agent" containerID="cri-o://0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631" gracePeriod=30 Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.006916 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="proxy-httpd" containerID="cri-o://e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f" gracePeriod=30 Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.006976 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="sg-core" containerID="cri-o://019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1" gracePeriod=30 Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.007016 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-notification-agent" containerID="cri-o://cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9" gracePeriod=30 Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.035780 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.184:3000/\": EOF" Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.102479 
4773 generic.go:334] "Generic (PLEG): container finished" podID="65527bca-7849-47c3-ad54-11916b724542" containerID="98ab4bf86fb27e5ce91e6ac8cadda15ae4e9ae56de3cf88dd4f17e0378bf7690" exitCode=0 Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.102568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" event={"ID":"65527bca-7849-47c3-ad54-11916b724542","Type":"ContainerDied","Data":"98ab4bf86fb27e5ce91e6ac8cadda15ae4e9ae56de3cf88dd4f17e0378bf7690"} Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.110150 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerStarted","Data":"f901320deee653211c9902ad31a24604874f479a98d460695a145bf0ad670cdb"} Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.110488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerStarted","Data":"c2519f55abe4b3833965797aaf1fe76f7b58d5498502239a59b55e4a3f520015"} Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.121261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" event={"ID":"22988650-1474-4ba4-a6c0-2deb003ae3e7","Type":"ContainerStarted","Data":"7939251d5d082a99bf6a76ba2389339f727bfbacbe079c6a0e086d350d8aceae"} Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.122394 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.122436 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.179774 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" podStartSLOduration=4.179750958 
podStartE2EDuration="4.179750958s" podCreationTimestamp="2026-01-21 15:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:56.175548495 +0000 UTC m=+1441.100038127" watchObservedRunningTime="2026-01-21 15:47:56.179750958 +0000 UTC m=+1441.104240580" Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.283073 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.675623 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 15:47:56 crc kubenswrapper[4773]: I0121 15:47:56.745643 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:47:57 crc kubenswrapper[4773]: I0121 15:47:57.150525 4773 generic.go:334] "Generic (PLEG): container finished" podID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerID="019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1" exitCode=2 Jan 21 15:47:57 crc kubenswrapper[4773]: I0121 15:47:57.150617 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerDied","Data":"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1"} Jan 21 15:47:57 crc kubenswrapper[4773]: I0121 15:47:57.152178 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="cinder-scheduler" containerID="cri-o://65336c42cda405dbe37e125660327bccfef75e0eae15549918240c0c3f8b4376" gracePeriod=30 Jan 21 15:47:57 crc kubenswrapper[4773]: I0121 15:47:57.152591 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="probe" 
containerID="cri-o://15df608948326e527aa465468ab2efcdcacadd9642187de89fa4409e4cf7d3c2" gracePeriod=30 Jan 21 15:47:57 crc kubenswrapper[4773]: I0121 15:47:57.399501 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" path="/var/lib/kubelet/pods/13e848be-010f-4797-b0cc-a5ff046c55c8/volumes" Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.173315 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" event={"ID":"65527bca-7849-47c3-ad54-11916b724542","Type":"ContainerStarted","Data":"118ac53ee6f0e2da22020764297cba6d99a16fca95123276676f78eae861883d"} Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.180763 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerStarted","Data":"4942c0cf13267cc8db102e037b15fcef87ecbce9be74eecb9897e0ebb90e4863"} Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.180843 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.180854 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api-log" containerID="cri-o://f901320deee653211c9902ad31a24604874f479a98d460695a145bf0ad670cdb" gracePeriod=30 Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.180960 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api" containerID="cri-o://4942c0cf13267cc8db102e037b15fcef87ecbce9be74eecb9897e0ebb90e4863" gracePeriod=30 Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.184488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"4453fa11-ade2-4d7d-a714-67525df64b70","Type":"ContainerStarted","Data":"765678f6baa418d68a6c7338aac8bd1f7efae0cc72ad61e414337888379ef34d"} Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.184746 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.195249 4773 generic.go:334] "Generic (PLEG): container finished" podID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerID="e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f" exitCode=0 Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.195300 4773 generic.go:334] "Generic (PLEG): container finished" podID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerID="0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631" exitCode=0 Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.195379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerDied","Data":"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f"} Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.195418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerDied","Data":"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631"} Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.199788 4773 generic.go:334] "Generic (PLEG): container finished" podID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerID="15df608948326e527aa465468ab2efcdcacadd9642187de89fa4409e4cf7d3c2" exitCode=0 Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.200744 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerDied","Data":"15df608948326e527aa465468ab2efcdcacadd9642187de89fa4409e4cf7d3c2"} Jan 21 15:47:58 crc 
kubenswrapper[4773]: I0121 15:47:58.227010 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=5.226985666 podStartE2EDuration="5.226985666s" podCreationTimestamp="2026-01-21 15:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:58.211381805 +0000 UTC m=+1443.135871427" watchObservedRunningTime="2026-01-21 15:47:58.226985666 +0000 UTC m=+1443.151475288" Jan 21 15:47:58 crc kubenswrapper[4773]: I0121 15:47:58.250966 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.25093944 podStartE2EDuration="6.25093944s" podCreationTimestamp="2026-01-21 15:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:58.23199132 +0000 UTC m=+1443.156480942" watchObservedRunningTime="2026-01-21 15:47:58.25093944 +0000 UTC m=+1443.175429062" Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.287173 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4511970-2daa-41c2-b649-96144b875bee" containerID="f901320deee653211c9902ad31a24604874f479a98d460695a145bf0ad670cdb" exitCode=143 Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.287583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerDied","Data":"f901320deee653211c9902ad31a24604874f479a98d460695a145bf0ad670cdb"} Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.330035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ac95d669-f09a-43e9-a44e-088a7761fba8","Type":"ContainerStarted","Data":"9a449300cb7e533f053422bad57fbc68782fe26814f5b540861ff8092324479f"} Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 
15:47:59.347164 4773 generic.go:334] "Generic (PLEG): container finished" podID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerID="65336c42cda405dbe37e125660327bccfef75e0eae15549918240c0c3f8b4376" exitCode=0 Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.347269 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerDied","Data":"65336c42cda405dbe37e125660327bccfef75e0eae15549918240c0c3f8b4376"} Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.371022 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.566976419 podStartE2EDuration="7.371002412s" podCreationTimestamp="2026-01-21 15:47:52 +0000 UTC" firstStartedPulling="2026-01-21 15:47:54.401370012 +0000 UTC m=+1439.325859624" lastFinishedPulling="2026-01-21 15:47:58.205395995 +0000 UTC m=+1443.129885617" observedRunningTime="2026-01-21 15:47:59.368207977 +0000 UTC m=+1444.292697619" watchObservedRunningTime="2026-01-21 15:47:59.371002412 +0000 UTC m=+1444.295492034" Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.474318 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.489938 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" podStartSLOduration=6.48989403 podStartE2EDuration="6.48989403s" podCreationTimestamp="2026-01-21 15:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:47:59.413345131 +0000 UTC m=+1444.337834763" watchObservedRunningTime="2026-01-21 15:47:59.48989403 +0000 UTC m=+1444.414383652" Jan 21 15:47:59 crc kubenswrapper[4773]: I0121 15:47:59.952026 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.022870 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n24r\" (UniqueName: \"kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.022928 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.022975 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.023070 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.023125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.023148 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom\") pod \"81537761-2fff-47ab-b6bf-d8bc5d14d228\" (UID: \"81537761-2fff-47ab-b6bf-d8bc5d14d228\") " Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.023332 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.023646 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81537761-2fff-47ab-b6bf-d8bc5d14d228-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.040025 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r" (OuterVolumeSpecName: "kube-api-access-5n24r") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "kube-api-access-5n24r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.043907 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts" (OuterVolumeSpecName: "scripts") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.047969 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.127238 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.127492 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.127590 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n24r\" (UniqueName: \"kubernetes.io/projected/81537761-2fff-47ab-b6bf-d8bc5d14d228-kube-api-access-5n24r\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.171110 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.229679 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.252757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data" (OuterVolumeSpecName: "config-data") pod "81537761-2fff-47ab-b6bf-d8bc5d14d228" (UID: "81537761-2fff-47ab-b6bf-d8bc5d14d228"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.331953 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81537761-2fff-47ab-b6bf-d8bc5d14d228-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.379124 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.380132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"81537761-2fff-47ab-b6bf-d8bc5d14d228","Type":"ContainerDied","Data":"1d3bffd4a7af84d06998a91d2378dd2c080e3a22b3ed4092a2d88ac1604f9dda"} Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.380188 4773 scope.go:117] "RemoveContainer" containerID="15df608948326e527aa465468ab2efcdcacadd9642187de89fa4409e4cf7d3c2" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.464233 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.481332 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.510075 4773 scope.go:117] "RemoveContainer" containerID="65336c42cda405dbe37e125660327bccfef75e0eae15549918240c0c3f8b4376" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.529395 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:48:00 crc kubenswrapper[4773]: E0121 15:48:00.529999 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="init" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530024 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="init" Jan 21 15:48:00 crc kubenswrapper[4773]: E0121 15:48:00.530050 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="cinder-scheduler" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530059 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="cinder-scheduler" Jan 21 15:48:00 crc kubenswrapper[4773]: E0121 
15:48:00.530079 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="dnsmasq-dns" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530088 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="dnsmasq-dns" Jan 21 15:48:00 crc kubenswrapper[4773]: E0121 15:48:00.530112 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="probe" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530120 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="probe" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530355 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="probe" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530371 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e848be-010f-4797-b0cc-a5ff046c55c8" containerName="dnsmasq-dns" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.530399 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" containerName="cinder-scheduler" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.531918 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.543131 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.569173 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.643914 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.643971 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.644008 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.644055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6fkv\" (UniqueName: \"kubernetes.io/projected/5c8c084f-2abc-435a-80fc-e8101b086e50-kube-api-access-q6fkv\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.644106 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.644127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c8c084f-2abc-435a-80fc-e8101b086e50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6fkv\" (UniqueName: \"kubernetes.io/projected/5c8c084f-2abc-435a-80fc-e8101b086e50-kube-api-access-q6fkv\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c8c084f-2abc-435a-80fc-e8101b086e50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.749954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.763770 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c8c084f-2abc-435a-80fc-e8101b086e50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.766673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-scripts\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.769346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" 
Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.770265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.774436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c8c084f-2abc-435a-80fc-e8101b086e50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.784117 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6fkv\" (UniqueName: \"kubernetes.io/projected/5c8c084f-2abc-435a-80fc-e8101b086e50-kube-api-access-q6fkv\") pod \"cinder-scheduler-0\" (UID: \"5c8c084f-2abc-435a-80fc-e8101b086e50\") " pod="openstack/cinder-scheduler-0" Jan 21 15:48:00 crc kubenswrapper[4773]: I0121 15:48:00.861226 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.318052 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.414126 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81537761-2fff-47ab-b6bf-d8bc5d14d228" path="/var/lib/kubelet/pods/81537761-2fff-47ab-b6bf-d8bc5d14d228/volumes" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.417074 4773 generic.go:334] "Generic (PLEG): container finished" podID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerID="cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9" exitCode=0 Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.417222 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.420067 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerDied","Data":"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9"} Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.420205 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60bdf67b-df3f-4544-8c2f-08d0b2395149","Type":"ContainerDied","Data":"e88d50ff186bbce0931bb143c7cfafbfed7e1c991a8fb05148c62e84de784fa9"} Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.420366 4773 scope.go:117] "RemoveContainer" containerID="e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.423523 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="ac95d669-f09a-43e9-a44e-088a7761fba8" containerName="cloudkitty-proc" containerID="cri-o://9a449300cb7e533f053422bad57fbc68782fe26814f5b540861ff8092324479f" gracePeriod=30 Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.464523 4773 scope.go:117] "RemoveContainer" 
containerID="019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.473924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.474003 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.474080 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.474109 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckxlx\" (UniqueName: \"kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.474172 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.477297 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.477381 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd\") pod \"60bdf67b-df3f-4544-8c2f-08d0b2395149\" (UID: \"60bdf67b-df3f-4544-8c2f-08d0b2395149\") " Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.480119 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.489105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.489931 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx" (OuterVolumeSpecName: "kube-api-access-ckxlx") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "kube-api-access-ckxlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.493519 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts" (OuterVolumeSpecName: "scripts") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.512283 4773 scope.go:117] "RemoveContainer" containerID="cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.515681 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.550463 4773 scope.go:117] "RemoveContainer" containerID="0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.582728 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.582763 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf67b-df3f-4544-8c2f-08d0b2395149-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.582777 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckxlx\" (UniqueName: \"kubernetes.io/projected/60bdf67b-df3f-4544-8c2f-08d0b2395149-kube-api-access-ckxlx\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.582790 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.582801 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.599061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.625364 4773 scope.go:117] "RemoveContainer" containerID="e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.626826 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f\": container with ID starting with e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f not found: ID does not exist" containerID="e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.626868 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f"} err="failed to get container status \"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f\": rpc error: code = NotFound desc = could not find container \"e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f\": container with ID starting with e9abb914e7e7fecc5a7e3edeb9f429dfe8036d864211717f1e78abe1cdca168f not found: ID does not exist" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.626896 4773 scope.go:117] "RemoveContainer" containerID="019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.630358 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1\": container with ID starting with 019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1 not found: ID does not exist" containerID="019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.630402 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1"} err="failed to get container status \"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1\": rpc error: code = NotFound desc = could not find container \"019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1\": container with ID starting with 019177d4c3974b11d2aadbe6956643bcabc7cb07c5afce4a0ccb8dc1a04bd3b1 not found: ID does not exist" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.630435 4773 scope.go:117] "RemoveContainer" containerID="cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.632075 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9\": container with ID starting with cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9 not found: ID does not exist" containerID="cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.632102 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9"} err="failed to get container status \"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9\": rpc error: code = NotFound desc = could not find container \"cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9\": container with ID starting with cb632e2d1e919beccd1510b14887e220180d2830da806f4436424a8a65ddabf9 not found: ID does not exist" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.632127 4773 scope.go:117] "RemoveContainer" containerID="0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 
15:48:01.640158 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631\": container with ID starting with 0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631 not found: ID does not exist" containerID="0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.640226 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631"} err="failed to get container status \"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631\": rpc error: code = NotFound desc = could not find container \"0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631\": container with ID starting with 0ed8c5060fbb334a2d00e207c7b78e0b4b80569e1644049ec7dac813c36e3631 not found: ID does not exist" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.657354 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data" (OuterVolumeSpecName: "config-data") pod "60bdf67b-df3f-4544-8c2f-08d0b2395149" (UID: "60bdf67b-df3f-4544-8c2f-08d0b2395149"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.658168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 15:48:01 crc kubenswrapper[4773]: W0121 15:48:01.671910 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c8c084f_2abc_435a_80fc_e8101b086e50.slice/crio-c47196d45c4b51def5028759ff9bda193fca7897e72f10717938fd2411d49ec7 WatchSource:0}: Error finding container c47196d45c4b51def5028759ff9bda193fca7897e72f10717938fd2411d49ec7: Status 404 returned error can't find the container with id c47196d45c4b51def5028759ff9bda193fca7897e72f10717938fd2411d49ec7 Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.684723 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.684757 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf67b-df3f-4544-8c2f-08d0b2395149-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.771765 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.811140 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.822685 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.823160 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-notification-agent" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 
15:48:01.823176 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-notification-agent" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.823197 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="proxy-httpd" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823203 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="proxy-httpd" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.823218 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-central-agent" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823224 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-central-agent" Jan 21 15:48:01 crc kubenswrapper[4773]: E0121 15:48:01.823238 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="sg-core" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823244 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="sg-core" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823405 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="sg-core" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823425 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-central-agent" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.823442 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="ceilometer-notification-agent" Jan 21 15:48:01 crc 
kubenswrapper[4773]: I0121 15:48:01.823455 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" containerName="proxy-httpd" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.825163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.827539 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.830760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.861005 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.888251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prlbx\" (UniqueName: \"kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.888588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.888760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc 
kubenswrapper[4773]: I0121 15:48:01.888915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.889089 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.889365 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.889521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992152 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prlbx\" (UniqueName: \"kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.992893 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 
15:48:01.992936 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.993824 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:01 crc kubenswrapper[4773]: I0121 15:48:01.996961 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.016193 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.018623 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.022586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prlbx\" (UniqueName: \"kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx\") pod \"ceilometer-0\" (UID: 
\"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.027538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts\") pod \"ceilometer-0\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") " pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.200712 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.481283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c8c084f-2abc-435a-80fc-e8101b086e50","Type":"ContainerStarted","Data":"c47196d45c4b51def5028759ff9bda193fca7897e72f10717938fd2411d49ec7"} Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.685712 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.687592 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" Jan 21 15:48:02 crc kubenswrapper[4773]: I0121 15:48:02.886409 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 15:48:03.394878 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bdf67b-df3f-4544-8c2f-08d0b2395149" path="/var/lib/kubelet/pods/60bdf67b-df3f-4544-8c2f-08d0b2395149/volumes" Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 15:48:03.493386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerStarted","Data":"7451f28401156a2225b0bb6dc1cb5fd23171f9e280c23c31f857034ece3e30c3"} Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 
15:48:03.759398 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 15:48:03.760713 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 15:48:03.844128 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"] Jan 21 15:48:03 crc kubenswrapper[4773]: I0121 15:48:03.844421 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" containerID="cri-o://54db186a5a591064e5aeaaa5b58b527dc40d68aba19e9bc384d839b68250d7fd" gracePeriod=10 Jan 21 15:48:04 crc kubenswrapper[4773]: I0121 15:48:04.363196 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.180:5353: connect: connection refused" Jan 21 15:48:04 crc kubenswrapper[4773]: I0121 15:48:04.504448 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerID="54db186a5a591064e5aeaaa5b58b527dc40d68aba19e9bc384d839b68250d7fd" exitCode=0 Jan 21 15:48:04 crc kubenswrapper[4773]: I0121 15:48:04.504520 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" event={"ID":"b2696559-c843-4ec6-a347-f91ae2c790d3","Type":"ContainerDied","Data":"54db186a5a591064e5aeaaa5b58b527dc40d68aba19e9bc384d839b68250d7fd"} Jan 21 15:48:06 crc kubenswrapper[4773]: I0121 15:48:06.580179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"5c8c084f-2abc-435a-80fc-e8101b086e50","Type":"ContainerStarted","Data":"b819b0df79dde662f66ab780e1cb1d0016e0728920782cd2b5fcd2fbef140f38"} Jan 21 15:48:07 crc kubenswrapper[4773]: I0121 15:48:07.507089 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4453fa11-ade2-4d7d-a714-67525df64b70" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.190:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:48:07 crc kubenswrapper[4773]: I0121 15:48:07.597046 4773 generic.go:334] "Generic (PLEG): container finished" podID="ac95d669-f09a-43e9-a44e-088a7761fba8" containerID="9a449300cb7e533f053422bad57fbc68782fe26814f5b540861ff8092324479f" exitCode=0 Jan 21 15:48:07 crc kubenswrapper[4773]: I0121 15:48:07.597100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ac95d669-f09a-43e9-a44e-088a7761fba8","Type":"ContainerDied","Data":"9a449300cb7e533f053422bad57fbc68782fe26814f5b540861ff8092324479f"} Jan 21 15:48:10 crc kubenswrapper[4773]: I0121 15:48:10.113624 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 15:48:10 crc kubenswrapper[4773]: I0121 15:48:10.346588 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:13 crc kubenswrapper[4773]: E0121 15:48:13.532780 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Jan 21 15:48:13 crc kubenswrapper[4773]: E0121 15:48:13.533506 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n78hb9h5b5h57dh66h9fh5bfh5bdhb6h4h64fh5bfh5f6h579h66fh58dh645h65ch67h5f9hbfhffh694h559h687h558h5f8h55chb4h565h679h5c6q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jcvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(d34079f2-2d08-4ddc-8d49-a9afaadaba8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 15:48:13 crc kubenswrapper[4773]: E0121 15:48:13.534927 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="d34079f2-2d08-4ddc-8d49-a9afaadaba8c" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.599090 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.668969 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.668961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" event={"ID":"b2696559-c843-4ec6-a347-f91ae2c790d3","Type":"ContainerDied","Data":"b54f93c4f02dd14dcb6fbe7a97f4f8c0c08086996fd70bd207aaff2f7017af17"} Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.669544 4773 scope.go:117] "RemoveContainer" containerID="54db186a5a591064e5aeaaa5b58b527dc40d68aba19e9bc384d839b68250d7fd" Jan 21 15:48:13 crc kubenswrapper[4773]: E0121 15:48:13.743839 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="d34079f2-2d08-4ddc-8d49-a9afaadaba8c" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774303 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774391 4773 scope.go:117] "RemoveContainer" containerID="dd7f972c369b04e57e6c5940c12fadfc33324a7028f191963036eb5e831ceecb" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774617 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774680 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm6gd\" 
(UniqueName: \"kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774735 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774782 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.774933 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc\") pod \"b2696559-c843-4ec6-a347-f91ae2c790d3\" (UID: \"b2696559-c843-4ec6-a347-f91ae2c790d3\") " Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.817313 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd" (OuterVolumeSpecName: "kube-api-access-gm6gd") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "kube-api-access-gm6gd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.876512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.876944 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config" (OuterVolumeSpecName: "config") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.876907 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.878150 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.878173 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm6gd\" (UniqueName: \"kubernetes.io/projected/b2696559-c843-4ec6-a347-f91ae2c790d3-kube-api-access-gm6gd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.878185 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.878194 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.894325 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.895029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b2696559-c843-4ec6-a347-f91ae2c790d3" (UID: "b2696559-c843-4ec6-a347-f91ae2c790d3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.979803 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:13 crc kubenswrapper[4773]: I0121 15:48:13.979840 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2696559-c843-4ec6-a347-f91ae2c790d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.016998 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"] Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.031085 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-hmr6x"] Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.031459 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189466 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189677 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189724 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mb74\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189757 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.189879 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle\") pod \"ac95d669-f09a-43e9-a44e-088a7761fba8\" (UID: \"ac95d669-f09a-43e9-a44e-088a7761fba8\") " Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.203836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts" (OuterVolumeSpecName: "scripts") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.204023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.204074 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74" (OuterVolumeSpecName: "kube-api-access-4mb74") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "kube-api-access-4mb74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.214990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs" (OuterVolumeSpecName: "certs") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.280855 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data" (OuterVolumeSpecName: "config-data") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.292272 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.292310 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.292323 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.292334 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mb74\" (UniqueName: \"kubernetes.io/projected/ac95d669-f09a-43e9-a44e-088a7761fba8-kube-api-access-4mb74\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.292346 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.321884 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac95d669-f09a-43e9-a44e-088a7761fba8" (UID: "ac95d669-f09a-43e9-a44e-088a7761fba8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.361853 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-hmr6x" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.180:5353: i/o timeout" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.394582 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac95d669-f09a-43e9-a44e-088a7761fba8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.682829 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ac95d669-f09a-43e9-a44e-088a7761fba8","Type":"ContainerDied","Data":"6451e01cac51a1712392f085cf104b6af0eef5a5b90dcf5f6aca588261daf8c8"} Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.682901 4773 scope.go:117] "RemoveContainer" containerID="9a449300cb7e533f053422bad57fbc68782fe26814f5b540861ff8092324479f" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.683069 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.726171 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.748219 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.758814 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:48:14 crc kubenswrapper[4773]: E0121 15:48:14.759235 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="init" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.759251 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="init" Jan 21 15:48:14 crc kubenswrapper[4773]: E0121 15:48:14.759266 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.759273 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" Jan 21 15:48:14 crc kubenswrapper[4773]: E0121 15:48:14.759294 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac95d669-f09a-43e9-a44e-088a7761fba8" containerName="cloudkitty-proc" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.759302 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac95d669-f09a-43e9-a44e-088a7761fba8" containerName="cloudkitty-proc" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.759475 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac95d669-f09a-43e9-a44e-088a7761fba8" containerName="cloudkitty-proc" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.759494 4773 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" containerName="dnsmasq-dns" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.760405 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.767426 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.773368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905107 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905745 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905793 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chpr\" (UniqueName: 
\"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905843 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:14 crc kubenswrapper[4773]: I0121 15:48:14.905865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.007939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.007989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.008086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") 
" pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.008128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.008183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.008209 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chpr\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.014218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.020464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.020650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.022364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.023459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.025117 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chpr\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr\") pod \"cloudkitty-proc-0\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.081748 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.397561 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac95d669-f09a-43e9-a44e-088a7761fba8" path="/var/lib/kubelet/pods/ac95d669-f09a-43e9-a44e-088a7761fba8/volumes" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.398371 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2696559-c843-4ec6-a347-f91ae2c790d3" path="/var/lib/kubelet/pods/b2696559-c843-4ec6-a347-f91ae2c790d3/volumes" Jan 21 15:48:15 crc kubenswrapper[4773]: W0121 15:48:15.603565 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c718710_7612_4e1f_b166_4c031c7051da.slice/crio-2588bc98d01b1289e925af2999552b1f558973b3fd34b6ffb6b2f4cd81a8fae8 WatchSource:0}: Error finding container 2588bc98d01b1289e925af2999552b1f558973b3fd34b6ffb6b2f4cd81a8fae8: Status 404 returned error can't find the container with id 2588bc98d01b1289e925af2999552b1f558973b3fd34b6ffb6b2f4cd81a8fae8 Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.606764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.694047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6c718710-7612-4e1f-b166-4c031c7051da","Type":"ContainerStarted","Data":"2588bc98d01b1289e925af2999552b1f558973b3fd34b6ffb6b2f4cd81a8fae8"} Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.696108 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5c8c084f-2abc-435a-80fc-e8101b086e50","Type":"ContainerStarted","Data":"7610d9685c75ea02e62138921ecc96c5c1240dbc35ee4b6c6104fb77b86b33ea"} Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.703499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerStarted","Data":"95db7b433113bab874aa46ede8c92d9d521d5215695f767e19829c83a738dc12"} Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.723914 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=15.723890525 podStartE2EDuration="15.723890525s" podCreationTimestamp="2026-01-21 15:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:15.714356608 +0000 UTC m=+1460.638846240" watchObservedRunningTime="2026-01-21 15:48:15.723890525 +0000 UTC m=+1460.648380147" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.862429 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 15:48:15 crc kubenswrapper[4773]: I0121 15:48:15.864486 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="5c8c084f-2abc-435a-80fc-e8101b086e50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.195:8080/\": dial tcp 10.217.0.195:8080: connect: connection refused" Jan 21 15:48:17 crc kubenswrapper[4773]: I0121 15:48:17.725537 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6c718710-7612-4e1f-b166-4c031c7051da","Type":"ContainerStarted","Data":"9eab6517396c2473e4608a5bebc694409a9bf9f73514a98ff5cba3407e2404e7"} Jan 21 15:48:17 crc kubenswrapper[4773]: I0121 15:48:17.749569 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.749551012 podStartE2EDuration="3.749551012s" podCreationTimestamp="2026-01-21 15:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 15:48:17.74502095 +0000 UTC m=+1462.669510572" watchObservedRunningTime="2026-01-21 15:48:17.749551012 +0000 UTC m=+1462.674040634" Jan 21 15:48:20 crc kubenswrapper[4773]: I0121 15:48:20.753232 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerStarted","Data":"a70d3616aa53a047fd8525d757af6e3e86b9d201e005e7c7056219fc27b29a9e"} Jan 21 15:48:21 crc kubenswrapper[4773]: I0121 15:48:21.057827 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 15:48:23 crc kubenswrapper[4773]: I0121 15:48:23.807672 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="32888aa3-cb52-484f-9745-5d5dfc5179df" containerName="galera" probeResult="failure" output="command timed out" Jan 21 15:48:23 crc kubenswrapper[4773]: I0121 15:48:23.807714 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="32888aa3-cb52-484f-9745-5d5dfc5179df" containerName="galera" probeResult="failure" output="command timed out" Jan 21 15:48:25 crc kubenswrapper[4773]: I0121 15:48:25.807554 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="869ad9c0-3593-4ebc-9b58-7b9615e46927" containerName="galera" probeResult="failure" output="command timed out" Jan 21 15:48:25 crc kubenswrapper[4773]: I0121 15:48:25.807914 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="869ad9c0-3593-4ebc-9b58-7b9615e46927" containerName="galera" probeResult="failure" output="command timed out" Jan 21 15:48:28 crc kubenswrapper[4773]: I0121 15:48:28.841526 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4511970-2daa-41c2-b649-96144b875bee" containerID="4942c0cf13267cc8db102e037b15fcef87ecbce9be74eecb9897e0ebb90e4863" exitCode=137 Jan 
21 15:48:28 crc kubenswrapper[4773]: I0121 15:48:28.842076 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerDied","Data":"4942c0cf13267cc8db102e037b15fcef87ecbce9be74eecb9897e0ebb90e4863"} Jan 21 15:48:28 crc kubenswrapper[4773]: I0121 15:48:28.987967 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api" probeResult="failure" output="Get \"http://10.217.0.194:8889/healthcheck\": dial tcp 10.217.0.194:8889: connect: connection refused" Jan 21 15:48:28 crc kubenswrapper[4773]: I0121 15:48:28.997499 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ss2cc"] Jan 21 15:48:28 crc kubenswrapper[4773]: I0121 15:48:28.998808 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.013300 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ss2cc"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.071326 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p97m2"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.073009 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.088656 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.088903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whvb\" (UniqueName: \"kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.088946 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.089127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw99b\" (UniqueName: \"kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.128763 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p97m2"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.143233 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-40ec-account-create-update-8t8mw"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.145827 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.150127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.181512 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-40ec-account-create-update-8t8mw"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whvb\" (UniqueName: \"kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191415 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhh2\" (UniqueName: \"kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191479 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw99b\" (UniqueName: \"kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.191780 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.192495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.192594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.214994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw99b\" (UniqueName: 
\"kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b\") pod \"nova-api-db-create-ss2cc\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.232331 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whvb\" (UniqueName: \"kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb\") pod \"nova-cell0-db-create-p97m2\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") " pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.255446 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k9sdj"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.257300 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.271273 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k9sdj"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.294189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.294497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhh2\" (UniqueName: \"kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 
15:48:29.294595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.294771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9zr\" (UniqueName: \"kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.295972 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.295992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d93f-account-create-update-6wdd5"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.297385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.303202 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.315148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhh2\" (UniqueName: \"kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2\") pod \"nova-api-40ec-account-create-update-8t8mw\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") " pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.316444 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d93f-account-create-update-6wdd5"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.325355 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.410261 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.459116 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt9zr\" (UniqueName: \"kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.460021 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5zl\" (UniqueName: \"kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.460285 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.461020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.531382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.531535 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-40ec-account-create-update-8t8mw" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.566239 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9zr\" (UniqueName: \"kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr\") pod \"nova-cell1-db-create-k9sdj\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") " pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.567534 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5zl\" (UniqueName: \"kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.567577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.569799 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: 
\"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.594257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5zl\" (UniqueName: \"kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl\") pod \"nova-cell0-d93f-account-create-update-6wdd5\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") " pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.595190 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.628076 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f175-account-create-update-rdlpn"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.629004 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.629403 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.637070 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.671500 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.671999 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxfzs\" (UniqueName: \"kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.685778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f175-account-create-update-rdlpn"] Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.773931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxfzs\" (UniqueName: \"kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.774289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.775990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.804833 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxfzs\" (UniqueName: \"kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs\") pod \"nova-cell1-f175-account-create-update-rdlpn\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") " pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.888919 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.969308 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.991923 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992016 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992068 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992100 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74k2w\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992278 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992332 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.992428 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle\") pod \"c4511970-2daa-41c2-b649-96144b875bee\" (UID: \"c4511970-2daa-41c2-b649-96144b875bee\") " Jan 21 15:48:29 crc kubenswrapper[4773]: I0121 15:48:29.996819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs" (OuterVolumeSpecName: "logs") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.000265 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.003477 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs" (OuterVolumeSpecName: "certs") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.008951 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts" (OuterVolumeSpecName: "scripts") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.009006 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w" (OuterVolumeSpecName: "kube-api-access-74k2w") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "kube-api-access-74k2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.041241 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.072509 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data" (OuterVolumeSpecName: "config-data") pod "c4511970-2daa-41c2-b649-96144b875bee" (UID: "c4511970-2daa-41c2-b649-96144b875bee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.096935 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4511970-2daa-41c2-b649-96144b875bee-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.096969 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.096978 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.096988 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.096997 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.097005 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4511970-2daa-41c2-b649-96144b875bee-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.097013 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74k2w\" (UniqueName: \"kubernetes.io/projected/c4511970-2daa-41c2-b649-96144b875bee-kube-api-access-74k2w\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.264109 4773 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-db-create-ss2cc"] Jan 21 15:48:30 crc kubenswrapper[4773]: W0121 15:48:30.391590 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddede4ee8_95b2_4feb_bedb_4fb9615014f4.slice/crio-4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd WatchSource:0}: Error finding container 4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd: Status 404 returned error can't find the container with id 4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.393355 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d93f-account-create-update-6wdd5"] Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.740966 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k9sdj"] Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.764181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-40ec-account-create-update-8t8mw"] Jan 21 15:48:30 crc kubenswrapper[4773]: W0121 15:48:30.778997 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bfe59ba_7b63_4e75_825b_05b03663557b.slice/crio-c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c WatchSource:0}: Error finding container c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c: Status 404 returned error can't find the container with id c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.795373 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p97m2"] Jan 21 15:48:30 crc kubenswrapper[4773]: W0121 15:48:30.807797 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6abc2d7_7db5_4e83_9aa3_e8beb591cb41.slice/crio-ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62 WatchSource:0}: Error finding container ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62: Status 404 returned error can't find the container with id ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62 Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.811496 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f175-account-create-update-rdlpn"] Jan 21 15:48:30 crc kubenswrapper[4773]: W0121 15:48:30.820340 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod057373ae_c81b_4e2b_a71c_aa81a62c4465.slice/crio-9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170 WatchSource:0}: Error finding container 9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170: Status 404 returned error can't find the container with id 9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170 Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.914939 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ss2cc" event={"ID":"4e686b20-33ac-474d-a46b-ea308b32cbf3","Type":"ContainerStarted","Data":"49825e98183e092d50e9a9e453894d403376a0dd0b5a334c62c57a25f6f9d892"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.916736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" event={"ID":"dede4ee8-95b2-4feb-bedb-4fb9615014f4","Type":"ContainerStarted","Data":"4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.918191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" 
event={"ID":"057373ae-c81b-4e2b-a71c-aa81a62c4465","Type":"ContainerStarted","Data":"9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.922422 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-40ec-account-create-update-8t8mw" event={"ID":"5bfe59ba-7b63-4e75-825b-05b03663557b","Type":"ContainerStarted","Data":"c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.924679 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c4511970-2daa-41c2-b649-96144b875bee","Type":"ContainerDied","Data":"c2519f55abe4b3833965797aaf1fe76f7b58d5498502239a59b55e4a3f520015"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.924741 4773 scope.go:117] "RemoveContainer" containerID="4942c0cf13267cc8db102e037b15fcef87ecbce9be74eecb9897e0ebb90e4863" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.924748 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.934216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p97m2" event={"ID":"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41","Type":"ContainerStarted","Data":"ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.937119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9sdj" event={"ID":"3f53fe25-e64e-4570-989f-ae65e7e23a71","Type":"ContainerStarted","Data":"2e6bee84243578d87447295e65b19ac97c05290da3394d36f9b0e4e0264bd68d"} Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.966887 4773 scope.go:117] "RemoveContainer" containerID="f901320deee653211c9902ad31a24604874f479a98d460695a145bf0ad670cdb" Jan 21 15:48:30 crc kubenswrapper[4773]: I0121 15:48:30.988807 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.016953 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.042576 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:48:31 crc kubenswrapper[4773]: E0121 15:48:31.043165 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api-log" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.043190 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api-log" Jan 21 15:48:31 crc kubenswrapper[4773]: E0121 15:48:31.043225 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.043235 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.043445 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.043489 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4511970-2daa-41c2-b649-96144b875bee" containerName="cloudkitty-api-log" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.044763 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.057604 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.057673 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.058012 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.063132 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120379 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.120984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.121087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: 
\"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.125810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.126017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58c8x\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.228124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.228252 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.228818 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.228897 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.229484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.229525 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.230192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.230229 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.230264 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs\") pod 
\"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.230303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58c8x\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.236403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.238476 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.241313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.242093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.242907 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.242959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.246140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.253728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58c8x\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x\") pod \"cloudkitty-api-0\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.377082 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.405238 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4511970-2daa-41c2-b649-96144b875bee" path="/var/lib/kubelet/pods/c4511970-2daa-41c2-b649-96144b875bee/volumes" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.949015 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ss2cc" event={"ID":"4e686b20-33ac-474d-a46b-ea308b32cbf3","Type":"ContainerStarted","Data":"d8f84c5d8bfb76a047741159ac1ae3f7705824fb8d50f6fa096960f26cc116ce"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.951552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" event={"ID":"dede4ee8-95b2-4feb-bedb-4fb9615014f4","Type":"ContainerStarted","Data":"6be08ef4f36b8ad74ccf236c93896982efedc5d5f17b0cdaef5550cf94a567fa"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.953370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" event={"ID":"057373ae-c81b-4e2b-a71c-aa81a62c4465","Type":"ContainerStarted","Data":"535d1a60cdad8c51a446fe9407095054f6e60a7c07816c48521ee6f3b97260b2"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.955532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-40ec-account-create-update-8t8mw" event={"ID":"5bfe59ba-7b63-4e75-825b-05b03663557b","Type":"ContainerStarted","Data":"abf92234ffc1d745ba6bda73402cfe734e11c2572ac7043a6cf0fc19945b7df5"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.958222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p97m2" event={"ID":"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41","Type":"ContainerStarted","Data":"c59894bbd3d82a699d43d1d37f0de0562c8b8a982700a5f942370c63309692ed"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.960237 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9sdj" event={"ID":"3f53fe25-e64e-4570-989f-ae65e7e23a71","Type":"ContainerStarted","Data":"6bbcaafb0d2a1ffe8ce7624a1a952fa65c970ab2d42da71b43fe6beffa4955e7"} Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.969893 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-ss2cc" podStartSLOduration=3.96987842 podStartE2EDuration="3.96987842s" podCreationTimestamp="2026-01-21 15:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:31.966231972 +0000 UTC m=+1476.890721594" watchObservedRunningTime="2026-01-21 15:48:31.96987842 +0000 UTC m=+1476.894368042" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.988837 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-p97m2" podStartSLOduration=2.988814889 podStartE2EDuration="2.988814889s" podCreationTimestamp="2026-01-21 15:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:31.978513512 +0000 UTC m=+1476.903003134" watchObservedRunningTime="2026-01-21 15:48:31.988814889 +0000 UTC m=+1476.913304521" Jan 21 15:48:31 crc kubenswrapper[4773]: I0121 15:48:31.999474 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" podStartSLOduration=2.999451085 podStartE2EDuration="2.999451085s" podCreationTimestamp="2026-01-21 15:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:31.992867238 +0000 UTC m=+1476.917356860" watchObservedRunningTime="2026-01-21 15:48:31.999451085 +0000 UTC m=+1476.923940707" Jan 21 15:48:32 crc 
kubenswrapper[4773]: I0121 15:48:32.024492 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-40ec-account-create-update-8t8mw" podStartSLOduration=3.024472548 podStartE2EDuration="3.024472548s" podCreationTimestamp="2026-01-21 15:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:32.00819224 +0000 UTC m=+1476.932681892" watchObservedRunningTime="2026-01-21 15:48:32.024472548 +0000 UTC m=+1476.948962170" Jan 21 15:48:32 crc kubenswrapper[4773]: I0121 15:48:32.031818 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" podStartSLOduration=3.031800435 podStartE2EDuration="3.031800435s" podCreationTimestamp="2026-01-21 15:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:32.026438251 +0000 UTC m=+1476.950927893" watchObservedRunningTime="2026-01-21 15:48:32.031800435 +0000 UTC m=+1476.956290057" Jan 21 15:48:36 crc kubenswrapper[4773]: I0121 15:48:36.067814 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-k9sdj" podStartSLOduration=7.067797266 podStartE2EDuration="7.067797266s" podCreationTimestamp="2026-01-21 15:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:32.045726319 +0000 UTC m=+1476.970215941" watchObservedRunningTime="2026-01-21 15:48:36.067797266 +0000 UTC m=+1480.992286888" Jan 21 15:48:36 crc kubenswrapper[4773]: I0121 15:48:36.076004 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:48:36 crc kubenswrapper[4773]: W0121 15:48:36.434882 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7bac139_85d2_4d70_b755_22c0e0e8fa92.slice/crio-84151022ca64a50747c5455f63a6f683dca3e43983630576461bc75521bcab90 WatchSource:0}: Error finding container 84151022ca64a50747c5455f63a6f683dca3e43983630576461bc75521bcab90: Status 404 returned error can't find the container with id 84151022ca64a50747c5455f63a6f683dca3e43983630576461bc75521bcab90 Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.005949 4773 generic.go:334] "Generic (PLEG): container finished" podID="3f53fe25-e64e-4570-989f-ae65e7e23a71" containerID="6bbcaafb0d2a1ffe8ce7624a1a952fa65c970ab2d42da71b43fe6beffa4955e7" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.006359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9sdj" event={"ID":"3f53fe25-e64e-4570-989f-ae65e7e23a71","Type":"ContainerDied","Data":"6bbcaafb0d2a1ffe8ce7624a1a952fa65c970ab2d42da71b43fe6beffa4955e7"} Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.009204 4773 generic.go:334] "Generic (PLEG): container finished" podID="4e686b20-33ac-474d-a46b-ea308b32cbf3" containerID="d8f84c5d8bfb76a047741159ac1ae3f7705824fb8d50f6fa096960f26cc116ce" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.009292 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ss2cc" event={"ID":"4e686b20-33ac-474d-a46b-ea308b32cbf3","Type":"ContainerDied","Data":"d8f84c5d8bfb76a047741159ac1ae3f7705824fb8d50f6fa096960f26cc116ce"} Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.010669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerStarted","Data":"84151022ca64a50747c5455f63a6f683dca3e43983630576461bc75521bcab90"} Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.012811 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="057373ae-c81b-4e2b-a71c-aa81a62c4465" containerID="535d1a60cdad8c51a446fe9407095054f6e60a7c07816c48521ee6f3b97260b2" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.012896 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" event={"ID":"057373ae-c81b-4e2b-a71c-aa81a62c4465","Type":"ContainerDied","Data":"535d1a60cdad8c51a446fe9407095054f6e60a7c07816c48521ee6f3b97260b2"} Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.014405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerStarted","Data":"2910f05f3469fa93ab094db87835eed9154ff593e953e2fe159bd5bbfd6c2379"} Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.017099 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" containerID="c59894bbd3d82a699d43d1d37f0de0562c8b8a982700a5f942370c63309692ed" exitCode=0 Jan 21 15:48:37 crc kubenswrapper[4773]: I0121 15:48:37.017201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p97m2" event={"ID":"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41","Type":"ContainerDied","Data":"c59894bbd3d82a699d43d1d37f0de0562c8b8a982700a5f942370c63309692ed"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.034591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerStarted","Data":"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.034937 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerStarted","Data":"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.035838 
4773 generic.go:334] "Generic (PLEG): container finished" podID="5bfe59ba-7b63-4e75-825b-05b03663557b" containerID="abf92234ffc1d745ba6bda73402cfe734e11c2572ac7043a6cf0fc19945b7df5" exitCode=0 Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.035901 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-40ec-account-create-update-8t8mw" event={"ID":"5bfe59ba-7b63-4e75-825b-05b03663557b","Type":"ContainerDied","Data":"abf92234ffc1d745ba6bda73402cfe734e11c2572ac7043a6cf0fc19945b7df5"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.037883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d34079f2-2d08-4ddc-8d49-a9afaadaba8c","Type":"ContainerStarted","Data":"6b6e7c0af2f2459f70ec074c094e6e1a7358609b78d38ec7cb2ed99eefa82729"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.039566 4773 generic.go:334] "Generic (PLEG): container finished" podID="dede4ee8-95b2-4feb-bedb-4fb9615014f4" containerID="6be08ef4f36b8ad74ccf236c93896982efedc5d5f17b0cdaef5550cf94a567fa" exitCode=0 Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.039683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" event={"ID":"dede4ee8-95b2-4feb-bedb-4fb9615014f4","Type":"ContainerDied","Data":"6be08ef4f36b8ad74ccf236c93896982efedc5d5f17b0cdaef5550cf94a567fa"} Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.128207 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.776380604 podStartE2EDuration="53.128184816s" podCreationTimestamp="2026-01-21 15:47:45 +0000 UTC" firstStartedPulling="2026-01-21 15:47:46.37177304 +0000 UTC m=+1431.296262662" lastFinishedPulling="2026-01-21 15:48:36.723577252 +0000 UTC m=+1481.648066874" observedRunningTime="2026-01-21 15:48:38.082884828 +0000 UTC m=+1483.007374450" watchObservedRunningTime="2026-01-21 15:48:38.128184816 +0000 UTC 
m=+1483.052674438" Jan 21 15:48:38 crc kubenswrapper[4773]: I0121 15:48:38.696097 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ss2cc" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.800079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts\") pod \"4e686b20-33ac-474d-a46b-ea308b32cbf3\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.800183 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw99b\" (UniqueName: \"kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b\") pod \"4e686b20-33ac-474d-a46b-ea308b32cbf3\" (UID: \"4e686b20-33ac-474d-a46b-ea308b32cbf3\") " Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.802123 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e686b20-33ac-474d-a46b-ea308b32cbf3" (UID: "4e686b20-33ac-474d-a46b-ea308b32cbf3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.818366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b" (OuterVolumeSpecName: "kube-api-access-dw99b") pod "4e686b20-33ac-474d-a46b-ea308b32cbf3" (UID: "4e686b20-33ac-474d-a46b-ea308b32cbf3"). InnerVolumeSpecName "kube-api-access-dw99b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.902261 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e686b20-33ac-474d-a46b-ea308b32cbf3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.902291 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw99b\" (UniqueName: \"kubernetes.io/projected/4e686b20-33ac-474d-a46b-ea308b32cbf3-kube-api-access-dw99b\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.922820 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.931991 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9sdj" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:38.938134 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p97m2" Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.058947 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f175-account-create-update-rdlpn"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.059460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f175-account-create-update-rdlpn" event={"ID":"057373ae-c81b-4e2b-a71c-aa81a62c4465","Type":"ContainerDied","Data":"9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170"}
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.059504 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad64c70949ee14e8b46725c6b8620b7b63794112e12090057f02ff6f71ee170"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.064807 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p97m2"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.065371 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p97m2" event={"ID":"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41","Type":"ContainerDied","Data":"ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62"}
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.065418 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca52b6c359e4ebc87893029eda712f64b4563f3945b2781721e0837cfbd79b62"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.068103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k9sdj" event={"ID":"3f53fe25-e64e-4570-989f-ae65e7e23a71","Type":"ContainerDied","Data":"2e6bee84243578d87447295e65b19ac97c05290da3394d36f9b0e4e0264bd68d"}
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.068129 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6bee84243578d87447295e65b19ac97c05290da3394d36f9b0e4e0264bd68d"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.068198 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k9sdj"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.076154 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ss2cc"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.076824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ss2cc" event={"ID":"4e686b20-33ac-474d-a46b-ea308b32cbf3","Type":"ContainerDied","Data":"49825e98183e092d50e9a9e453894d403376a0dd0b5a334c62c57a25f6f9d892"}
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.076868 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49825e98183e092d50e9a9e453894d403376a0dd0b5a334c62c57a25f6f9d892"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.078219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.105996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxfzs\" (UniqueName: \"kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs\") pod \"057373ae-c81b-4e2b-a71c-aa81a62c4465\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.106050 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt9zr\" (UniqueName: \"kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr\") pod \"3f53fe25-e64e-4570-989f-ae65e7e23a71\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.106083 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts\") pod \"3f53fe25-e64e-4570-989f-ae65e7e23a71\" (UID: \"3f53fe25-e64e-4570-989f-ae65e7e23a71\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.106204 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts\") pod \"057373ae-c81b-4e2b-a71c-aa81a62c4465\" (UID: \"057373ae-c81b-4e2b-a71c-aa81a62c4465\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.106262 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts\") pod \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.106328 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whvb\" (UniqueName: \"kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb\") pod \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\" (UID: \"e6abc2d7-7db5-4e83-9aa3-e8beb591cb41\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.111047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f53fe25-e64e-4570-989f-ae65e7e23a71" (UID: "3f53fe25-e64e-4570-989f-ae65e7e23a71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.113613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "057373ae-c81b-4e2b-a71c-aa81a62c4465" (UID: "057373ae-c81b-4e2b-a71c-aa81a62c4465"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.114076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" (UID: "e6abc2d7-7db5-4e83-9aa3-e8beb591cb41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.120624 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f53fe25-e64e-4570-989f-ae65e7e23a71-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.120994 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057373ae-c81b-4e2b-a71c-aa81a62c4465-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.121023 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.130021 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs" (OuterVolumeSpecName: "kube-api-access-bxfzs") pod "057373ae-c81b-4e2b-a71c-aa81a62c4465" (UID: "057373ae-c81b-4e2b-a71c-aa81a62c4465"). InnerVolumeSpecName "kube-api-access-bxfzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.138195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr" (OuterVolumeSpecName: "kube-api-access-pt9zr") pod "3f53fe25-e64e-4570-989f-ae65e7e23a71" (UID: "3f53fe25-e64e-4570-989f-ae65e7e23a71"). InnerVolumeSpecName "kube-api-access-pt9zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.144034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb" (OuterVolumeSpecName: "kube-api-access-9whvb") pod "e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" (UID: "e6abc2d7-7db5-4e83-9aa3-e8beb591cb41"). InnerVolumeSpecName "kube-api-access-9whvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.158650 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=9.158621818 podStartE2EDuration="9.158621818s" podCreationTimestamp="2026-01-21 15:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:39.117370109 +0000 UTC m=+1484.041859741" watchObservedRunningTime="2026-01-21 15:48:39.158621818 +0000 UTC m=+1484.083111440"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.223136 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whvb\" (UniqueName: \"kubernetes.io/projected/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41-kube-api-access-9whvb\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.223163 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxfzs\" (UniqueName: \"kubernetes.io/projected/057373ae-c81b-4e2b-a71c-aa81a62c4465-kube-api-access-bxfzs\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.223172 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt9zr\" (UniqueName: \"kubernetes.io/projected/3f53fe25-e64e-4570-989f-ae65e7e23a71-kube-api-access-pt9zr\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.408886 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.536026 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts\") pod \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.536157 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh5zl\" (UniqueName: \"kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl\") pod \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\" (UID: \"dede4ee8-95b2-4feb-bedb-4fb9615014f4\") "
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.537563 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dede4ee8-95b2-4feb-bedb-4fb9615014f4" (UID: "dede4ee8-95b2-4feb-bedb-4fb9615014f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.541573 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl" (OuterVolumeSpecName: "kube-api-access-kh5zl") pod "dede4ee8-95b2-4feb-bedb-4fb9615014f4" (UID: "dede4ee8-95b2-4feb-bedb-4fb9615014f4"). InnerVolumeSpecName "kube-api-access-kh5zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.638833 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dede4ee8-95b2-4feb-bedb-4fb9615014f4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:39.638869 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh5zl\" (UniqueName: \"kubernetes.io/projected/dede4ee8-95b2-4feb-bedb-4fb9615014f4-kube-api-access-kh5zl\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:40.086811 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5"
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:40.086798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d93f-account-create-update-6wdd5" event={"ID":"dede4ee8-95b2-4feb-bedb-4fb9615014f4","Type":"ContainerDied","Data":"4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd"}
Jan 21 15:48:40 crc kubenswrapper[4773]: I0121 15:48:40.087287 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c50a745e3d4514699ca926259c9065f1937bf6d35bac8948584791ed6e0badd"
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.103433 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-central-agent" containerID="cri-o://95db7b433113bab874aa46ede8c92d9d521d5215695f767e19829c83a738dc12" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.103446 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="proxy-httpd" containerID="cri-o://be6d374845ae835ce4de7ac224454c58524b035c9f1f85608cdf4e3a9ccb77da" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.103518 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="sg-core" containerID="cri-o://2910f05f3469fa93ab094db87835eed9154ff593e953e2fe159bd5bbfd6c2379" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.103533 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-notification-agent" containerID="cri-o://a70d3616aa53a047fd8525d757af6e3e86b9d201e005e7c7056219fc27b29a9e" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.103706 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerStarted","Data":"be6d374845ae835ce4de7ac224454c58524b035c9f1f85608cdf4e3a9ccb77da"}
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.104655 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.107144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-40ec-account-create-update-8t8mw" event={"ID":"5bfe59ba-7b63-4e75-825b-05b03663557b","Type":"ContainerDied","Data":"c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c"}
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.107192 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c961bd2c0275ebe6bdb6b19f673b52f42278a12a8b8be84847b06a8558c9840c"
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.119852 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-40ec-account-create-update-8t8mw"
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.136350 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.293225486 podStartE2EDuration="40.136332125s" podCreationTimestamp="2026-01-21 15:48:01 +0000 UTC" firstStartedPulling="2026-01-21 15:48:02.905924417 +0000 UTC m=+1447.830414039" lastFinishedPulling="2026-01-21 15:48:39.749031056 +0000 UTC m=+1484.673520678" observedRunningTime="2026-01-21 15:48:41.130331674 +0000 UTC m=+1486.054821306" watchObservedRunningTime="2026-01-21 15:48:41.136332125 +0000 UTC m=+1486.060821747"
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.264743 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.265022 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-log" containerID="cri-o://c7d16007bee633cfb0a063e976216e8c7a67a364a72cd792889055d53bed6133" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.265588 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-httpd" containerID="cri-o://58e704fd2700629523c14d0263c082cfa2eeefdb8d62ab5033503ff2a081525f" gracePeriod=30
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.282823 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvhh2\" (UniqueName: \"kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2\") pod \"5bfe59ba-7b63-4e75-825b-05b03663557b\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") "
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.282913 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts\") pod \"5bfe59ba-7b63-4e75-825b-05b03663557b\" (UID: \"5bfe59ba-7b63-4e75-825b-05b03663557b\") "
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.283642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bfe59ba-7b63-4e75-825b-05b03663557b" (UID: "5bfe59ba-7b63-4e75-825b-05b03663557b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.290896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2" (OuterVolumeSpecName: "kube-api-access-vvhh2") pod "5bfe59ba-7b63-4e75-825b-05b03663557b" (UID: "5bfe59ba-7b63-4e75-825b-05b03663557b"). InnerVolumeSpecName "kube-api-access-vvhh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.385244 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bfe59ba-7b63-4e75-825b-05b03663557b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:41 crc kubenswrapper[4773]: I0121 15:48:41.385282 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvhh2\" (UniqueName: \"kubernetes.io/projected/5bfe59ba-7b63-4e75-825b-05b03663557b-kube-api-access-vvhh2\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.126625 4773 generic.go:334] "Generic (PLEG): container finished" podID="8cf09d70-1803-460f-ba8d-0434313796cf" containerID="c7d16007bee633cfb0a063e976216e8c7a67a364a72cd792889055d53bed6133" exitCode=143
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.126703 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerDied","Data":"c7d16007bee633cfb0a063e976216e8c7a67a364a72cd792889055d53bed6133"}
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.129907 4773 generic.go:334] "Generic (PLEG): container finished" podID="af63624d-c514-4d75-a541-5948d7981c1e" containerID="be6d374845ae835ce4de7ac224454c58524b035c9f1f85608cdf4e3a9ccb77da" exitCode=0
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.129932 4773 generic.go:334] "Generic (PLEG): container finished" podID="af63624d-c514-4d75-a541-5948d7981c1e" containerID="2910f05f3469fa93ab094db87835eed9154ff593e953e2fe159bd5bbfd6c2379" exitCode=2
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.129942 4773 generic.go:334] "Generic (PLEG): container finished" podID="af63624d-c514-4d75-a541-5948d7981c1e" containerID="95db7b433113bab874aa46ede8c92d9d521d5215695f767e19829c83a738dc12" exitCode=0
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.130006 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-40ec-account-create-update-8t8mw"
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.130875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerDied","Data":"be6d374845ae835ce4de7ac224454c58524b035c9f1f85608cdf4e3a9ccb77da"}
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.130904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerDied","Data":"2910f05f3469fa93ab094db87835eed9154ff593e953e2fe159bd5bbfd6c2379"}
Jan 21 15:48:42 crc kubenswrapper[4773]: I0121 15:48:42.130915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerDied","Data":"95db7b433113bab874aa46ede8c92d9d521d5215695f767e19829c83a738dc12"}
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.205055 4773 generic.go:334] "Generic (PLEG): container finished" podID="af63624d-c514-4d75-a541-5948d7981c1e" containerID="a70d3616aa53a047fd8525d757af6e3e86b9d201e005e7c7056219fc27b29a9e" exitCode=0
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.205362 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerDied","Data":"a70d3616aa53a047fd8525d757af6e3e86b9d201e005e7c7056219fc27b29a9e"}
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.391991 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529271 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529320 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529647 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529678 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prlbx\" (UniqueName: \"kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx\") pod \"af63624d-c514-4d75-a541-5948d7981c1e\" (UID: \"af63624d-c514-4d75-a541-5948d7981c1e\") "
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.529921 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.531294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.531869 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.531891 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af63624d-c514-4d75-a541-5948d7981c1e-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.536106 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx" (OuterVolumeSpecName: "kube-api-access-prlbx") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "kube-api-access-prlbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.536881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts" (OuterVolumeSpecName: "scripts") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.578303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.634814 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prlbx\" (UniqueName: \"kubernetes.io/projected/af63624d-c514-4d75-a541-5948d7981c1e-kube-api-access-prlbx\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.634845 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.634854 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.638421 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.691870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data" (OuterVolumeSpecName: "config-data") pod "af63624d-c514-4d75-a541-5948d7981c1e" (UID: "af63624d-c514-4d75-a541-5948d7981c1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.736496 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:43 crc kubenswrapper[4773]: I0121 15:48:43.736790 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af63624d-c514-4d75-a541-5948d7981c1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.240173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af63624d-c514-4d75-a541-5948d7981c1e","Type":"ContainerDied","Data":"7451f28401156a2225b0bb6dc1cb5fd23171f9e280c23c31f857034ece3e30c3"}
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.240233 4773 scope.go:117] "RemoveContainer" containerID="be6d374845ae835ce4de7ac224454c58524b035c9f1f85608cdf4e3a9ccb77da"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.240253 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.289607 4773 scope.go:117] "RemoveContainer" containerID="2910f05f3469fa93ab094db87835eed9154ff593e953e2fe159bd5bbfd6c2379"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.339961 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.368908 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.394296 4773 scope.go:117] "RemoveContainer" containerID="a70d3616aa53a047fd8525d757af6e3e86b9d201e005e7c7056219fc27b29a9e"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.447830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448662 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f53fe25-e64e-4570-989f-ae65e7e23a71" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448680 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f53fe25-e64e-4570-989f-ae65e7e23a71" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448726 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057373ae-c81b-4e2b-a71c-aa81a62c4465" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448737 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="057373ae-c81b-4e2b-a71c-aa81a62c4465" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448761 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e686b20-33ac-474d-a46b-ea308b32cbf3" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448768 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e686b20-33ac-474d-a46b-ea308b32cbf3" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448813 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-central-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448820 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-central-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448852 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfe59ba-7b63-4e75-825b-05b03663557b" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448859 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfe59ba-7b63-4e75-825b-05b03663557b" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448882 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dede4ee8-95b2-4feb-bedb-4fb9615014f4" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448889 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dede4ee8-95b2-4feb-bedb-4fb9615014f4" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448905 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448918 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448934 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-notification-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448943 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-notification-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.448969 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="proxy-httpd"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.448976 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="proxy-httpd"
Jan 21 15:48:44 crc kubenswrapper[4773]: E0121 15:48:44.449014 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="sg-core"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449022 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="sg-core"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449408 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-central-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449440 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f53fe25-e64e-4570-989f-ae65e7e23a71" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449462 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dede4ee8-95b2-4feb-bedb-4fb9615014f4" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449489 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e686b20-33ac-474d-a46b-ea308b32cbf3" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="proxy-httpd"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449513 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfe59ba-7b63-4e75-825b-05b03663557b" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449560 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="sg-core"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449576 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="057373ae-c81b-4e2b-a71c-aa81a62c4465" containerName="mariadb-account-create-update"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449590 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" containerName="mariadb-database-create"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.449599 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="af63624d-c514-4d75-a541-5948d7981c1e" containerName="ceilometer-notification-agent"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.460858 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.467743 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.475607 4773 scope.go:117] "RemoveContainer" containerID="95db7b433113bab874aa46ede8c92d9d521d5215695f767e19829c83a738dc12"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.478716 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.508070 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.589989 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvncx\" (UniqueName: \"kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590192 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0"
Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590335 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590365 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.590645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693010 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvncx\" (UniqueName: \"kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data\") pod \"ceilometer-0\" (UID: 
\"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.693802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.694792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.705531 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.709277 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.710561 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.714816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.731550 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rvncx\" (UniqueName: \"kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx\") pod \"ceilometer-0\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.803253 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.816115 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6n97z"] Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.817809 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.822049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j2g5c" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.822292 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.822438 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.883741 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6n97z"] Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.897018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.897145 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvwh\" (UniqueName: \"kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.897178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.897202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.999855 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvwh\" (UniqueName: \"kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:44 crc kubenswrapper[4773]: I0121 15:48:44.999971 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 
crc kubenswrapper[4773]: I0121 15:48:45.000004 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.000321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.006253 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.006797 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.012639 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.026673 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvwh\" (UniqueName: \"kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh\") pod \"nova-cell0-conductor-db-sync-6n97z\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") " pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.223347 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6n97z" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.295564 4773 generic.go:334] "Generic (PLEG): container finished" podID="8cf09d70-1803-460f-ba8d-0434313796cf" containerID="58e704fd2700629523c14d0263c082cfa2eeefdb8d62ab5033503ff2a081525f" exitCode=0 Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.296067 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerDied","Data":"58e704fd2700629523c14d0263c082cfa2eeefdb8d62ab5033503ff2a081525f"} Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.401755 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af63624d-c514-4d75-a541-5948d7981c1e" path="/var/lib/kubelet/pods/af63624d-c514-4d75-a541-5948d7981c1e/volumes" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.434160 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.514944 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515019 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515050 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515199 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx8s5\" (UniqueName: \"kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515241 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515320 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts\") pod \"8cf09d70-1803-460f-ba8d-0434313796cf\" (UID: \"8cf09d70-1803-460f-ba8d-0434313796cf\") " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.515536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.516222 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.518804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs" (OuterVolumeSpecName: "logs") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.533211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5" (OuterVolumeSpecName: "kube-api-access-sx8s5") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "kube-api-access-sx8s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.533371 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts" (OuterVolumeSpecName: "scripts") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.561854 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.584008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151" (OuterVolumeSpecName: "glance") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "pvc-5fed4dce-f776-48ab-b524-c032df929151". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.620427 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.620652 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") on node \"crc\" " Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.620678 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx8s5\" (UniqueName: \"kubernetes.io/projected/8cf09d70-1803-460f-ba8d-0434313796cf-kube-api-access-sx8s5\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.620707 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf09d70-1803-460f-ba8d-0434313796cf-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.620764 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.651413 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.654792 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.654957 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5fed4dce-f776-48ab-b524-c032df929151" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151") on node "crc" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.674629 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data" (OuterVolumeSpecName: "config-data") pod "8cf09d70-1803-460f-ba8d-0434313796cf" (UID: "8cf09d70-1803-460f-ba8d-0434313796cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.725090 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.725122 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.725131 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf09d70-1803-460f-ba8d-0434313796cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.725141 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:45 crc kubenswrapper[4773]: W0121 15:48:45.836667 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb95f2a61_8fc4_4257_ad91_d0b45169dc09.slice/crio-dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799 WatchSource:0}: Error finding container dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799: Status 404 returned error can't find the container with id dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799 Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.838938 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6n97z"] Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.852539 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.852925 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-httpd" containerID="cri-o://cdc153495a400da1fe5074cd3f66136a2bf1ea23e35cfac649ae0bd5759b7bec" gracePeriod=30 Jan 21 15:48:45 crc kubenswrapper[4773]: I0121 15:48:45.852961 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-log" containerID="cri-o://b53978a831c8cdfbd46b0bc68784acf220e232095b7967ac9b5dc22720dc77c5" gracePeriod=30 Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.308240 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cf09d70-1803-460f-ba8d-0434313796cf","Type":"ContainerDied","Data":"1375bb58a3b7bcd7928d8fab95118d4d463c48ca84d5e0e7cef0633c40320155"} Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.308290 4773 scope.go:117] "RemoveContainer" containerID="58e704fd2700629523c14d0263c082cfa2eeefdb8d62ab5033503ff2a081525f" Jan 21 15:48:46 crc 
kubenswrapper[4773]: I0121 15:48:46.308373 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.315568 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb19b844-b560-4be2-8709-f78158c0eb36" containerID="b53978a831c8cdfbd46b0bc68784acf220e232095b7967ac9b5dc22720dc77c5" exitCode=143 Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.315648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerDied","Data":"b53978a831c8cdfbd46b0bc68784acf220e232095b7967ac9b5dc22720dc77c5"} Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.317334 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerStarted","Data":"2640643e8ff9eb0fda0efbaaa96501aa11b414c6cbe09faf930ba39a8c7d01da"} Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.318448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6n97z" event={"ID":"b95f2a61-8fc4-4257-ad91-d0b45169dc09","Type":"ContainerStarted","Data":"dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799"} Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.372785 4773 scope.go:117] "RemoveContainer" containerID="c7d16007bee633cfb0a063e976216e8c7a67a364a72cd792889055d53bed6133" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.389991 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.406247 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.415379 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Jan 21 15:48:46 crc kubenswrapper[4773]: E0121 15:48:46.415880 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-log" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.415903 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-log" Jan 21 15:48:46 crc kubenswrapper[4773]: E0121 15:48:46.415954 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-httpd" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.415963 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-httpd" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.416192 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-httpd" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.416215 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" containerName="glance-log" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.417667 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.428816 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.433246 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.433284 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.541865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.541910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.541936 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.541997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.542038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.542067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7xz\" (UniqueName: \"kubernetes.io/projected/805f46cc-de71-4353-9cb3-075eb306ace0-kube-api-access-hq7xz\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.542106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-logs\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.542143 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644140 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644267 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644319 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7xz\" (UniqueName: \"kubernetes.io/projected/805f46cc-de71-4353-9cb3-075eb306ace0-kube-api-access-hq7xz\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644392 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-logs\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644444 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644548 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.644777 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.645018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805f46cc-de71-4353-9cb3-075eb306ace0-logs\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.653609 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.653658 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/149dd6dfda276adff7f1f12e0c1d439e14e49afb630f1d98c3833562ebaefedd/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.656110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.656848 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.670060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.678920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805f46cc-de71-4353-9cb3-075eb306ace0-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.683485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7xz\" (UniqueName: \"kubernetes.io/projected/805f46cc-de71-4353-9cb3-075eb306ace0-kube-api-access-hq7xz\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:46 crc kubenswrapper[4773]: I0121 15:48:46.914752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5fed4dce-f776-48ab-b524-c032df929151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fed4dce-f776-48ab-b524-c032df929151\") pod \"glance-default-internal-api-0\" (UID: \"805f46cc-de71-4353-9cb3-075eb306ace0\") " pod="openstack/glance-default-internal-api-0" Jan 21 15:48:47 crc kubenswrapper[4773]: I0121 15:48:47.060286 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:47 crc kubenswrapper[4773]: I0121 15:48:47.348760 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerStarted","Data":"7316e6ed9b66ef385757fee57c24157736c92ea2cc0fb058daa2f3b19a427d6f"} Jan 21 15:48:47 crc kubenswrapper[4773]: I0121 15:48:47.398183 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf09d70-1803-460f-ba8d-0434313796cf" path="/var/lib/kubelet/pods/8cf09d70-1803-460f-ba8d-0434313796cf/volumes" Jan 21 15:48:47 crc kubenswrapper[4773]: I0121 15:48:47.742466 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 15:48:48 crc kubenswrapper[4773]: I0121 15:48:48.367347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerStarted","Data":"552e189394418b62bda9c847ad868c0406ea1e32b4c774500d7ed2027c7e1368"} Jan 21 15:48:48 crc kubenswrapper[4773]: I0121 15:48:48.369955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"805f46cc-de71-4353-9cb3-075eb306ace0","Type":"ContainerStarted","Data":"c4fe8202b5cf3374f6f102d584db9a2520b6c9625dbfdf2fc3d6777018c1861a"} Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.403602 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb19b844-b560-4be2-8709-f78158c0eb36" containerID="cdc153495a400da1fe5074cd3f66136a2bf1ea23e35cfac649ae0bd5759b7bec" exitCode=0 Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.406375 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"805f46cc-de71-4353-9cb3-075eb306ace0","Type":"ContainerStarted","Data":"f1c51fa45301c05b5214f906a01fbe45e444803c3d64fadfb0eae36dc57d4203"} Jan 21 15:48:49 crc 
kubenswrapper[4773]: I0121 15:48:49.406425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"805f46cc-de71-4353-9cb3-075eb306ace0","Type":"ContainerStarted","Data":"54a0ae643b1d43344fc09e5f0067ce06662afa52f860d72e9d09dc5e12682ae7"} Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.406439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerDied","Data":"cdc153495a400da1fe5074cd3f66136a2bf1ea23e35cfac649ae0bd5759b7bec"} Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.426389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerStarted","Data":"c87eb86aa1f2a2c35c80407865cc05d7fd72731bdecb2b8f21f72a81e78711b7"} Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.430406 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.430382738 podStartE2EDuration="3.430382738s" podCreationTimestamp="2026-01-21 15:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:48:49.425115814 +0000 UTC m=+1494.349605446" watchObservedRunningTime="2026-01-21 15:48:49.430382738 +0000 UTC m=+1494.354872360" Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.796496 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990159 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990279 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990650 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990732 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990791 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dghfb\" (UniqueName: \"kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.990937 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs\") pod \"cb19b844-b560-4be2-8709-f78158c0eb36\" (UID: \"cb19b844-b560-4be2-8709-f78158c0eb36\") " Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.992005 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs" (OuterVolumeSpecName: "logs") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:49 crc kubenswrapper[4773]: I0121 15:48:49.992807 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:49.998718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts" (OuterVolumeSpecName: "scripts") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.002863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb" (OuterVolumeSpecName: "kube-api-access-dghfb") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "kube-api-access-dghfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.034318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0" (OuterVolumeSpecName: "glance") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.076935 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.090666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data" (OuterVolumeSpecName: "config-data") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093237 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093271 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dghfb\" (UniqueName: \"kubernetes.io/projected/cb19b844-b560-4be2-8709-f78158c0eb36-kube-api-access-dghfb\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093282 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093294 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb19b844-b560-4be2-8709-f78158c0eb36-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093303 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093311 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.093357 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") on node \"crc\" " Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.111886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cb19b844-b560-4be2-8709-f78158c0eb36" (UID: "cb19b844-b560-4be2-8709-f78158c0eb36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.185786 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.186899 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0") on node "crc" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.194950 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.195000 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb19b844-b560-4be2-8709-f78158c0eb36-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.443678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb19b844-b560-4be2-8709-f78158c0eb36","Type":"ContainerDied","Data":"2d2882707873a5f0b38a8f1aa367811fb5d7d432295dde9ace28b3976e867b5f"} Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.443734 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.443789 4773 scope.go:117] "RemoveContainer" containerID="cdc153495a400da1fe5074cd3f66136a2bf1ea23e35cfac649ae0bd5759b7bec" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.491611 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.493938 4773 scope.go:117] "RemoveContainer" containerID="b53978a831c8cdfbd46b0bc68784acf220e232095b7967ac9b5dc22720dc77c5" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.509756 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.530980 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:50 crc kubenswrapper[4773]: E0121 15:48:50.531464 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-httpd" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.531482 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-httpd" Jan 21 15:48:50 crc kubenswrapper[4773]: E0121 15:48:50.531500 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-log" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.531507 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-log" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.531735 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-log" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.531758 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" containerName="glance-httpd" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.533625 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.536951 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.537177 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.543394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.707796 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-logs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.708947 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " 
pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnd99\" (UniqueName: \"kubernetes.io/projected/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-kube-api-access-jnd99\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709538 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.709811 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod 
\"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-logs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812454 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnd99\" (UniqueName: \"kubernetes.io/projected/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-kube-api-access-jnd99\") pod \"glance-default-external-api-0\" (UID: 
\"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.812611 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.813458 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-logs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.814371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " 
pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.830643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.831275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.834454 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.834504 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e6ddc15da45c425afe1609cfb31bc40a4ae0f7e1b60627fb6c6d646b1880744e/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.838334 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.840718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.843208 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnd99\" (UniqueName: \"kubernetes.io/projected/4d7b9329-7502-4e45-bf22-cfe4d7f5451b-kube-api-access-jnd99\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:50 crc kubenswrapper[4773]: I0121 15:48:50.927065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1711d92-ec3c-49d5-9795-bf641fe9ccc0\") pod \"glance-default-external-api-0\" (UID: \"4d7b9329-7502-4e45-bf22-cfe4d7f5451b\") " pod="openstack/glance-default-external-api-0" Jan 21 15:48:51 crc kubenswrapper[4773]: I0121 15:48:51.185396 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 15:48:51 crc kubenswrapper[4773]: I0121 15:48:51.400873 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb19b844-b560-4be2-8709-f78158c0eb36" path="/var/lib/kubelet/pods/cb19b844-b560-4be2-8709-f78158c0eb36/volumes" Jan 21 15:48:51 crc kubenswrapper[4773]: I0121 15:48:51.468964 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerStarted","Data":"8a38e6724616deb2ecd535b84f1d1816d392e4a3b2fab73c1170a296cf5938c1"} Jan 21 15:48:51 crc kubenswrapper[4773]: I0121 15:48:51.469147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 15:48:51 crc kubenswrapper[4773]: I0121 15:48:51.503832 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.943802906 podStartE2EDuration="7.503809818s" podCreationTimestamp="2026-01-21 15:48:44 +0000 UTC" firstStartedPulling="2026-01-21 15:48:45.564790749 +0000 UTC m=+1490.489280371" lastFinishedPulling="2026-01-21 15:48:50.124797661 +0000 UTC m=+1495.049287283" observedRunningTime="2026-01-21 15:48:51.498115222 +0000 UTC m=+1496.422604864" watchObservedRunningTime="2026-01-21 15:48:51.503809818 +0000 UTC m=+1496.428299440" Jan 21 15:48:56 crc kubenswrapper[4773]: I0121 15:48:56.449456 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:56 crc kubenswrapper[4773]: I0121 15:48:56.450449 4773 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-central-agent" containerID="cri-o://7316e6ed9b66ef385757fee57c24157736c92ea2cc0fb058daa2f3b19a427d6f" gracePeriod=30 Jan 21 15:48:56 crc kubenswrapper[4773]: I0121 15:48:56.450520 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="sg-core" containerID="cri-o://c87eb86aa1f2a2c35c80407865cc05d7fd72731bdecb2b8f21f72a81e78711b7" gracePeriod=30 Jan 21 15:48:56 crc kubenswrapper[4773]: I0121 15:48:56.450527 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-notification-agent" containerID="cri-o://552e189394418b62bda9c847ad868c0406ea1e32b4c774500d7ed2027c7e1368" gracePeriod=30 Jan 21 15:48:56 crc kubenswrapper[4773]: I0121 15:48:56.450559 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="proxy-httpd" containerID="cri-o://8a38e6724616deb2ecd535b84f1d1816d392e4a3b2fab73c1170a296cf5938c1" gracePeriod=30 Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.061369 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.063173 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.099067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.119341 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.550655 4773 generic.go:334] "Generic (PLEG): container finished" podID="b908d70f-6a28-48d0-92db-09566c887f6c" containerID="8a38e6724616deb2ecd535b84f1d1816d392e4a3b2fab73c1170a296cf5938c1" exitCode=0 Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.551052 4773 generic.go:334] "Generic (PLEG): container finished" podID="b908d70f-6a28-48d0-92db-09566c887f6c" containerID="c87eb86aa1f2a2c35c80407865cc05d7fd72731bdecb2b8f21f72a81e78711b7" exitCode=2 Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.551064 4773 generic.go:334] "Generic (PLEG): container finished" podID="b908d70f-6a28-48d0-92db-09566c887f6c" containerID="552e189394418b62bda9c847ad868c0406ea1e32b4c774500d7ed2027c7e1368" exitCode=0 Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.551074 4773 generic.go:334] "Generic (PLEG): container finished" podID="b908d70f-6a28-48d0-92db-09566c887f6c" containerID="7316e6ed9b66ef385757fee57c24157736c92ea2cc0fb058daa2f3b19a427d6f" exitCode=0 Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerDied","Data":"8a38e6724616deb2ecd535b84f1d1816d392e4a3b2fab73c1170a296cf5938c1"} Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552537 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerDied","Data":"c87eb86aa1f2a2c35c80407865cc05d7fd72731bdecb2b8f21f72a81e78711b7"} Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552563 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerDied","Data":"552e189394418b62bda9c847ad868c0406ea1e32b4c774500d7ed2027c7e1368"} Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerDied","Data":"7316e6ed9b66ef385757fee57c24157736c92ea2cc0fb058daa2f3b19a427d6f"} Jan 21 15:48:57 crc kubenswrapper[4773]: I0121 15:48:57.552731 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.159927 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.330508 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.469624 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.469750 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.469808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: 
\"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.469835 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.470382 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.470436 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.470495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvncx\" (UniqueName: \"kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.471012 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.471068 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts\") pod \"b908d70f-6a28-48d0-92db-09566c887f6c\" (UID: \"b908d70f-6a28-48d0-92db-09566c887f6c\") " Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.472192 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.472213 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b908d70f-6a28-48d0-92db-09566c887f6c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.475854 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts" (OuterVolumeSpecName: "scripts") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.477219 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx" (OuterVolumeSpecName: "kube-api-access-rvncx") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "kube-api-access-rvncx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.511201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.574228 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvncx\" (UniqueName: \"kubernetes.io/projected/b908d70f-6a28-48d0-92db-09566c887f6c-kube-api-access-rvncx\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.574256 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.574265 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.575912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b908d70f-6a28-48d0-92db-09566c887f6c","Type":"ContainerDied","Data":"2640643e8ff9eb0fda0efbaaa96501aa11b414c6cbe09faf930ba39a8c7d01da"} Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.575956 4773 scope.go:117] "RemoveContainer" containerID="8a38e6724616deb2ecd535b84f1d1816d392e4a3b2fab73c1170a296cf5938c1" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.576074 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.581372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d7b9329-7502-4e45-bf22-cfe4d7f5451b","Type":"ContainerStarted","Data":"da36dcc9f8bfed9b1905b61d5ea92db400ab8b23d845d64e008979cc438438b4"} Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.613786 4773 scope.go:117] "RemoveContainer" containerID="c87eb86aa1f2a2c35c80407865cc05d7fd72731bdecb2b8f21f72a81e78711b7" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.620378 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.642469 4773 scope.go:117] "RemoveContainer" containerID="552e189394418b62bda9c847ad868c0406ea1e32b4c774500d7ed2027c7e1368" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.644436 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data" (OuterVolumeSpecName: "config-data") pod "b908d70f-6a28-48d0-92db-09566c887f6c" (UID: "b908d70f-6a28-48d0-92db-09566c887f6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.674465 4773 scope.go:117] "RemoveContainer" containerID="7316e6ed9b66ef385757fee57c24157736c92ea2cc0fb058daa2f3b19a427d6f" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.677187 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.677222 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b908d70f-6a28-48d0-92db-09566c887f6c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.916367 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.926559 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.939550 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:58 crc kubenswrapper[4773]: E0121 15:48:58.940472 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="sg-core" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940493 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="sg-core" Jan 21 15:48:58 crc kubenswrapper[4773]: E0121 15:48:58.940512 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="proxy-httpd" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940519 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="proxy-httpd" Jan 21 15:48:58 crc 
kubenswrapper[4773]: E0121 15:48:58.940529 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-central-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940536 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-central-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: E0121 15:48:58.940556 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-notification-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940561 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-notification-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940791 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-notification-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940806 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="sg-core" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940819 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="proxy-httpd" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.940829 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" containerName="ceilometer-central-agent" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.942619 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.944580 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.944826 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985580 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985712 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4694\" (UniqueName: \"kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.985876 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:58 crc kubenswrapper[4773]: I0121 15:48:58.991487 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087790 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087830 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.087965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4694\" (UniqueName: \"kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.088917 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " 
pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.089042 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.094570 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.098954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.099108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.099227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.109190 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4694\" (UniqueName: 
\"kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694\") pod \"ceilometer-0\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.291666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.406276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b908d70f-6a28-48d0-92db-09566c887f6c" path="/var/lib/kubelet/pods/b908d70f-6a28-48d0-92db-09566c887f6c/volumes" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.598564 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.598862 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.598776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d7b9329-7502-4e45-bf22-cfe4d7f5451b","Type":"ContainerStarted","Data":"ff8ef2e4ef3564519141057bd80f12852d773bb1784993c8288d85b4730f41e8"} Jan 21 15:48:59 crc kubenswrapper[4773]: I0121 15:48:59.880894 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:48:59 crc kubenswrapper[4773]: W0121 15:48:59.887575 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a21025b_2c22_4719_9221_07159b0ec74a.slice/crio-6f77c1476f69c8b7c79b2dd55ee48c2135a9d2d9f2600f79ed325f4de9b4af8d WatchSource:0}: Error finding container 6f77c1476f69c8b7c79b2dd55ee48c2135a9d2d9f2600f79ed325f4de9b4af8d: Status 404 returned error can't find the container with id 6f77c1476f69c8b7c79b2dd55ee48c2135a9d2d9f2600f79ed325f4de9b4af8d Jan 21 15:49:00 crc kubenswrapper[4773]: I0121 15:49:00.014436 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 15:49:00 crc kubenswrapper[4773]: I0121 15:49:00.230031 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 15:49:00 crc kubenswrapper[4773]: I0121 15:49:00.609266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6n97z" event={"ID":"b95f2a61-8fc4-4257-ad91-d0b45169dc09","Type":"ContainerStarted","Data":"b48d8918e0041b67b95045ce1886f07f41b253b7b11712c8b5045b0015b572d0"} Jan 21 15:49:00 crc kubenswrapper[4773]: I0121 15:49:00.611980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerStarted","Data":"6f77c1476f69c8b7c79b2dd55ee48c2135a9d2d9f2600f79ed325f4de9b4af8d"} Jan 21 15:49:00 crc kubenswrapper[4773]: I0121 15:49:00.633600 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6n97z" podStartSLOduration=3.057843728 podStartE2EDuration="16.633577528s" podCreationTimestamp="2026-01-21 15:48:44 +0000 UTC" firstStartedPulling="2026-01-21 15:48:45.839246866 +0000 UTC m=+1490.763736488" lastFinishedPulling="2026-01-21 15:48:59.414980666 +0000 UTC m=+1504.339470288" observedRunningTime="2026-01-21 15:49:00.625039025 +0000 UTC m=+1505.549528657" watchObservedRunningTime="2026-01-21 15:49:00.633577528 +0000 UTC m=+1505.558067150" Jan 21 15:49:01 crc kubenswrapper[4773]: I0121 15:49:01.623074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d7b9329-7502-4e45-bf22-cfe4d7f5451b","Type":"ContainerStarted","Data":"21ffe573691e9170be3fed1a8d48c50b98ae676590f7eb0af6a3df73afa65c4b"} Jan 21 15:49:01 crc kubenswrapper[4773]: I0121 15:49:01.652576 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=11.652552654 podStartE2EDuration="11.652552654s" podCreationTimestamp="2026-01-21 15:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:01.647000763 +0000 UTC m=+1506.571490385" watchObservedRunningTime="2026-01-21 15:49:01.652552654 +0000 UTC m=+1506.577042276" Jan 21 15:49:02 crc kubenswrapper[4773]: I0121 15:49:02.636689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerStarted","Data":"84de43465a5e7533b6cdb9fa3715bea5fd4653bf3fa6d36541f6e9a268454312"} Jan 21 15:49:05 crc kubenswrapper[4773]: I0121 15:49:05.682530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerStarted","Data":"4e3a35b82ba358d8e42115ccbe8f5db2e6ea657f8a4e23c0eaf3211ff27a13ab"} Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 15:49:11.186492 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 15:49:11.187018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 15:49:11.220341 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 15:49:11.234208 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 15:49:11.739064 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 15:49:11 crc kubenswrapper[4773]: I0121 
15:49:11.739240 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 15:49:13 crc kubenswrapper[4773]: I0121 15:49:13.756730 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:49:13 crc kubenswrapper[4773]: I0121 15:49:13.757109 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 15:49:14 crc kubenswrapper[4773]: I0121 15:49:14.270172 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 15:49:14 crc kubenswrapper[4773]: I0121 15:49:14.274543 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 15:49:15 crc kubenswrapper[4773]: I0121 15:49:15.268917 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="5c8c084f-2abc-435a-80fc-e8101b086e50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.195:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:49:16 crc kubenswrapper[4773]: I0121 15:49:16.393853 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.204:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:49:16 crc kubenswrapper[4773]: I0121 15:49:16.394211 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.204:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:49:20 crc kubenswrapper[4773]: I0121 15:49:20.468320 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Jan 21 15:49:21 crc kubenswrapper[4773]: I0121 15:49:21.847640 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerStarted","Data":"8c89fb4c1d02ee0aa2d2003bc6d04e25aa40db50b2f43505e55d246fe87a61f0"} Jan 21 15:49:23 crc kubenswrapper[4773]: I0121 15:49:23.871976 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerStarted","Data":"5bdd85dc697b2937c2e23ccd300f3176666916962a7e9bc49160c21ec7ad7f12"} Jan 21 15:49:23 crc kubenswrapper[4773]: I0121 15:49:23.873453 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 15:49:23 crc kubenswrapper[4773]: I0121 15:49:23.906565 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.186015747 podStartE2EDuration="25.906537597s" podCreationTimestamp="2026-01-21 15:48:58 +0000 UTC" firstStartedPulling="2026-01-21 15:48:59.890425535 +0000 UTC m=+1504.814915157" lastFinishedPulling="2026-01-21 15:49:22.610947385 +0000 UTC m=+1527.535437007" observedRunningTime="2026-01-21 15:49:23.89162325 +0000 UTC m=+1528.816112872" watchObservedRunningTime="2026-01-21 15:49:23.906537597 +0000 UTC m=+1528.831027219" Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.234455 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.235238 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-central-agent" containerID="cri-o://84de43465a5e7533b6cdb9fa3715bea5fd4653bf3fa6d36541f6e9a268454312" gracePeriod=30 Jan 21 15:49:32 crc 
kubenswrapper[4773]: I0121 15:49:32.235300 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="proxy-httpd" containerID="cri-o://5bdd85dc697b2937c2e23ccd300f3176666916962a7e9bc49160c21ec7ad7f12" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.235348 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="sg-core" containerID="cri-o://8c89fb4c1d02ee0aa2d2003bc6d04e25aa40db50b2f43505e55d246fe87a61f0" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.235384 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-notification-agent" containerID="cri-o://4e3a35b82ba358d8e42115ccbe8f5db2e6ea657f8a4e23c0eaf3211ff27a13ab" gracePeriod=30 Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.245486 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.209:3000/\": EOF" Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.961167 4773 generic.go:334] "Generic (PLEG): container finished" podID="3a21025b-2c22-4719-9221-07159b0ec74a" containerID="8c89fb4c1d02ee0aa2d2003bc6d04e25aa40db50b2f43505e55d246fe87a61f0" exitCode=2 Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.961504 4773 generic.go:334] "Generic (PLEG): container finished" podID="3a21025b-2c22-4719-9221-07159b0ec74a" containerID="84de43465a5e7533b6cdb9fa3715bea5fd4653bf3fa6d36541f6e9a268454312" exitCode=0 Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.961271 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerDied","Data":"8c89fb4c1d02ee0aa2d2003bc6d04e25aa40db50b2f43505e55d246fe87a61f0"} Jan 21 15:49:32 crc kubenswrapper[4773]: I0121 15:49:32.961552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerDied","Data":"84de43465a5e7533b6cdb9fa3715bea5fd4653bf3fa6d36541f6e9a268454312"} Jan 21 15:49:33 crc kubenswrapper[4773]: I0121 15:49:33.973024 4773 generic.go:334] "Generic (PLEG): container finished" podID="3a21025b-2c22-4719-9221-07159b0ec74a" containerID="5bdd85dc697b2937c2e23ccd300f3176666916962a7e9bc49160c21ec7ad7f12" exitCode=0 Jan 21 15:49:33 crc kubenswrapper[4773]: I0121 15:49:33.973075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerDied","Data":"5bdd85dc697b2937c2e23ccd300f3176666916962a7e9bc49160c21ec7ad7f12"} Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.015002 4773 generic.go:334] "Generic (PLEG): container finished" podID="3a21025b-2c22-4719-9221-07159b0ec74a" containerID="4e3a35b82ba358d8e42115ccbe8f5db2e6ea657f8a4e23c0eaf3211ff27a13ab" exitCode=0 Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.015116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerDied","Data":"4e3a35b82ba358d8e42115ccbe8f5db2e6ea657f8a4e23c0eaf3211ff27a13ab"} Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.333424 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.438946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.439039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.439115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.439953 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4694\" (UniqueName: \"kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.440007 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.440059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.440090 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd\") pod \"3a21025b-2c22-4719-9221-07159b0ec74a\" (UID: \"3a21025b-2c22-4719-9221-07159b0ec74a\") " Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.440136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.440638 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.441230 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.458715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts" (OuterVolumeSpecName: "scripts") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.461839 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694" (OuterVolumeSpecName: "kube-api-access-m4694") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "kube-api-access-m4694". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.484963 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.540504 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.542852 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.542873 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4694\" (UniqueName: \"kubernetes.io/projected/3a21025b-2c22-4719-9221-07159b0ec74a-kube-api-access-m4694\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.542884 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.542894 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.542902 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a21025b-2c22-4719-9221-07159b0ec74a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.569096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data" (OuterVolumeSpecName: "config-data") pod "3a21025b-2c22-4719-9221-07159b0ec74a" (UID: "3a21025b-2c22-4719-9221-07159b0ec74a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:49:36 crc kubenswrapper[4773]: I0121 15:49:36.644589 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a21025b-2c22-4719-9221-07159b0ec74a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.030869 4773 generic.go:334] "Generic (PLEG): container finished" podID="b95f2a61-8fc4-4257-ad91-d0b45169dc09" containerID="b48d8918e0041b67b95045ce1886f07f41b253b7b11712c8b5045b0015b572d0" exitCode=0 Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.030960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6n97z" event={"ID":"b95f2a61-8fc4-4257-ad91-d0b45169dc09","Type":"ContainerDied","Data":"b48d8918e0041b67b95045ce1886f07f41b253b7b11712c8b5045b0015b572d0"} Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.034127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a21025b-2c22-4719-9221-07159b0ec74a","Type":"ContainerDied","Data":"6f77c1476f69c8b7c79b2dd55ee48c2135a9d2d9f2600f79ed325f4de9b4af8d"} Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.034193 4773 scope.go:117] "RemoveContainer" containerID="5bdd85dc697b2937c2e23ccd300f3176666916962a7e9bc49160c21ec7ad7f12" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.034353 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.092134 4773 scope.go:117] "RemoveContainer" containerID="8c89fb4c1d02ee0aa2d2003bc6d04e25aa40db50b2f43505e55d246fe87a61f0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.097062 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.120803 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131050 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:49:37 crc kubenswrapper[4773]: E0121 15:49:37.131510 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-notification-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131526 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-notification-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: E0121 15:49:37.131558 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="sg-core" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131564 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="sg-core" Jan 21 15:49:37 crc kubenswrapper[4773]: E0121 15:49:37.131580 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="proxy-httpd" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131586 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="proxy-httpd" Jan 21 15:49:37 crc kubenswrapper[4773]: E0121 15:49:37.131599 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-central-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131605 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-central-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131791 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="sg-core" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131810 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-central-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131820 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="ceilometer-notification-agent" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.131835 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" containerName="proxy-httpd" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.133953 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.136191 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.136442 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.138636 4773 scope.go:117] "RemoveContainer" containerID="4e3a35b82ba358d8e42115ccbe8f5db2e6ea657f8a4e23c0eaf3211ff27a13ab" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.151363 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.183508 4773 scope.go:117] "RemoveContainer" containerID="84de43465a5e7533b6cdb9fa3715bea5fd4653bf3fa6d36541f6e9a268454312" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.257881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwht\" (UniqueName: \"kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.257981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.258228 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.258597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.258676 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.258861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.259062 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.361385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.361926 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.361963 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.361999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.362081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.362150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwht\" (UniqueName: \"kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.362217 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " 
pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.362418 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.362637 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.367932 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.367955 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.368079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0" Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.368507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") pod 
\"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0"
Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.378676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwht\" (UniqueName: \"kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht\") pod \"ceilometer-0\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") " pod="openstack/ceilometer-0"
Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.399339 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a21025b-2c22-4719-9221-07159b0ec74a" path="/var/lib/kubelet/pods/3a21025b-2c22-4719-9221-07159b0ec74a/volumes"
Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.461878 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:49:37 crc kubenswrapper[4773]: I0121 15:49:37.974515 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.047985 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerStarted","Data":"d5200b494dfcce8b990d59ff18999d80e641b82ef9bc3a405dd16ebea567c917"}
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.639786 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6n97z"
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.700284 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data\") pod \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") "
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.700382 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle\") pod \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") "
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.700446 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts\") pod \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") "
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.700486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhvwh\" (UniqueName: \"kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh\") pod \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\" (UID: \"b95f2a61-8fc4-4257-ad91-d0b45169dc09\") "
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.711412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh" (OuterVolumeSpecName: "kube-api-access-lhvwh") pod "b95f2a61-8fc4-4257-ad91-d0b45169dc09" (UID: "b95f2a61-8fc4-4257-ad91-d0b45169dc09"). InnerVolumeSpecName "kube-api-access-lhvwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.719105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts" (OuterVolumeSpecName: "scripts") pod "b95f2a61-8fc4-4257-ad91-d0b45169dc09" (UID: "b95f2a61-8fc4-4257-ad91-d0b45169dc09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.751821 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data" (OuterVolumeSpecName: "config-data") pod "b95f2a61-8fc4-4257-ad91-d0b45169dc09" (UID: "b95f2a61-8fc4-4257-ad91-d0b45169dc09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.765034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b95f2a61-8fc4-4257-ad91-d0b45169dc09" (UID: "b95f2a61-8fc4-4257-ad91-d0b45169dc09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.809929 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.810174 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.810256 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95f2a61-8fc4-4257-ad91-d0b45169dc09-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:38 crc kubenswrapper[4773]: I0121 15:49:38.810366 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhvwh\" (UniqueName: \"kubernetes.io/projected/b95f2a61-8fc4-4257-ad91-d0b45169dc09-kube-api-access-lhvwh\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.005775 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.061219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6n97z" event={"ID":"b95f2a61-8fc4-4257-ad91-d0b45169dc09","Type":"ContainerDied","Data":"dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799"}
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.061265 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd46dae15a80c85aa6291fa9c4e72e7da3f1cc375007fb34cf6e2f96aecd3799"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.061342 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6n97z"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.159162 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 15:49:39 crc kubenswrapper[4773]: E0121 15:49:39.160085 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95f2a61-8fc4-4257-ad91-d0b45169dc09" containerName="nova-cell0-conductor-db-sync"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.160116 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95f2a61-8fc4-4257-ad91-d0b45169dc09" containerName="nova-cell0-conductor-db-sync"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.160418 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95f2a61-8fc4-4257-ad91-d0b45169dc09" containerName="nova-cell0-conductor-db-sync"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.161498 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.166544 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j2g5c"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.166828 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.170724 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.223234 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.223378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.223464 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vmv\" (UniqueName: \"kubernetes.io/projected/aa487edb-f8c0-439e-af92-83c72235393e-kube-api-access-q9vmv\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.324965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.325079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.325146 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vmv\" (UniqueName: \"kubernetes.io/projected/aa487edb-f8c0-439e-af92-83c72235393e-kube-api-access-q9vmv\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.329806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.329849 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa487edb-f8c0-439e-af92-83c72235393e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.375496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vmv\" (UniqueName: \"kubernetes.io/projected/aa487edb-f8c0-439e-af92-83c72235393e-kube-api-access-q9vmv\") pod \"nova-cell0-conductor-0\" (UID: \"aa487edb-f8c0-439e-af92-83c72235393e\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:39 crc kubenswrapper[4773]: I0121 15:49:39.500105 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:40 crc kubenswrapper[4773]: I0121 15:49:40.041414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 15:49:40 crc kubenswrapper[4773]: W0121 15:49:40.060982 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa487edb_f8c0_439e_af92_83c72235393e.slice/crio-e7d279d54095129131d1e437b5e7a120409597c5db291c93af543759d06f3bbc WatchSource:0}: Error finding container e7d279d54095129131d1e437b5e7a120409597c5db291c93af543759d06f3bbc: Status 404 returned error can't find the container with id e7d279d54095129131d1e437b5e7a120409597c5db291c93af543759d06f3bbc
Jan 21 15:49:40 crc kubenswrapper[4773]: I0121 15:49:40.083843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa487edb-f8c0-439e-af92-83c72235393e","Type":"ContainerStarted","Data":"e7d279d54095129131d1e437b5e7a120409597c5db291c93af543759d06f3bbc"}
Jan 21 15:49:40 crc kubenswrapper[4773]: I0121 15:49:40.089924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerStarted","Data":"af85c068de34a167ec78b7828382cc548a72139657ed074f4c80f80fdc5e3bfb"}
Jan 21 15:49:41 crc kubenswrapper[4773]: I0121 15:49:41.102242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa487edb-f8c0-439e-af92-83c72235393e","Type":"ContainerStarted","Data":"5cae86f534ce354895bea6087d93a7b1cfb5bb959f195b9f97db93f4fce97196"}
Jan 21 15:49:41 crc kubenswrapper[4773]: I0121 15:49:41.102598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:41 crc kubenswrapper[4773]: I0121 15:49:41.105305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerStarted","Data":"57cf8caf877e0a309a7bc4f94ace397b71bb4574d97ff0ef185d300cb7c40feb"}
Jan 21 15:49:41 crc kubenswrapper[4773]: I0121 15:49:41.128913 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.128887823 podStartE2EDuration="2.128887823s" podCreationTimestamp="2026-01-21 15:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:41.123567178 +0000 UTC m=+1546.048056800" watchObservedRunningTime="2026-01-21 15:49:41.128887823 +0000 UTC m=+1546.053377445"
Jan 21 15:49:42 crc kubenswrapper[4773]: I0121 15:49:42.117815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerStarted","Data":"93cf2708082e8caf0dfe56c404b5db196a04b54f9c9b54bd63c98df7728122bd"}
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176034 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerStarted","Data":"5240e1f81ee3532c8baa0e6abbfd543d782790cb19a1955ce46321378560da6c"}
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176573 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176288 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="sg-core" containerID="cri-o://93cf2708082e8caf0dfe56c404b5db196a04b54f9c9b54bd63c98df7728122bd" gracePeriod=30
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176185 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-central-agent" containerID="cri-o://af85c068de34a167ec78b7828382cc548a72139657ed074f4c80f80fdc5e3bfb" gracePeriod=30
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176347 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-notification-agent" containerID="cri-o://57cf8caf877e0a309a7bc4f94ace397b71bb4574d97ff0ef185d300cb7c40feb" gracePeriod=30
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.176325 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="proxy-httpd" containerID="cri-o://5240e1f81ee3532c8baa0e6abbfd543d782790cb19a1955ce46321378560da6c" gracePeriod=30
Jan 21 15:49:48 crc kubenswrapper[4773]: I0121 15:49:48.206610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.277158949 podStartE2EDuration="11.206592454s" podCreationTimestamp="2026-01-21 15:49:37 +0000 UTC" firstStartedPulling="2026-01-21 15:49:37.976784527 +0000 UTC m=+1542.901274149" lastFinishedPulling="2026-01-21 15:49:46.906218032 +0000 UTC m=+1551.830707654" observedRunningTime="2026-01-21 15:49:48.200615581 +0000 UTC m=+1553.125105223" watchObservedRunningTime="2026-01-21 15:49:48.206592454 +0000 UTC m=+1553.131082076"
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189117 4773 generic.go:334] "Generic (PLEG): container finished" podID="4ad976fa-167a-4810-9be8-59144083fe5b" containerID="5240e1f81ee3532c8baa0e6abbfd543d782790cb19a1955ce46321378560da6c" exitCode=0
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189440 4773 generic.go:334] "Generic (PLEG): container finished" podID="4ad976fa-167a-4810-9be8-59144083fe5b" containerID="93cf2708082e8caf0dfe56c404b5db196a04b54f9c9b54bd63c98df7728122bd" exitCode=2
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189459 4773 generic.go:334] "Generic (PLEG): container finished" podID="4ad976fa-167a-4810-9be8-59144083fe5b" containerID="57cf8caf877e0a309a7bc4f94ace397b71bb4574d97ff0ef185d300cb7c40feb" exitCode=0
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189226 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerDied","Data":"5240e1f81ee3532c8baa0e6abbfd543d782790cb19a1955ce46321378560da6c"}
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerDied","Data":"93cf2708082e8caf0dfe56c404b5db196a04b54f9c9b54bd63c98df7728122bd"}
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.189539 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerDied","Data":"57cf8caf877e0a309a7bc4f94ace397b71bb4574d97ff0ef185d300cb7c40feb"}
Jan 21 15:49:49 crc kubenswrapper[4773]: I0121 15:49:49.553403 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.046390 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xhr58"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.048498 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.069778 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.070395 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.075521 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xhr58"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.153901 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.154075 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.154110 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.154168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ktv\" (UniqueName: \"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.237323 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.239416 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.248400 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.253902 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.255514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ktv\" (UniqueName: \"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.255588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.255727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.255751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.284793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.313653 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.313990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ktv\" (UniqueName: \"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.314346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xhr58\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.393417 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xhr58"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.460433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.485515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmg8f\" (UniqueName: \"kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.485599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.485860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.485909 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.487482 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.502658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.517035 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.541475 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.543180 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.547094 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.553230 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.583733 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.585223 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.591319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.591586 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9b8n\" (UniqueName: \"kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.591732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmg8f\" (UniqueName: \"kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.593119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.596710 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.599302 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.599417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.599504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.599629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.606606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.607262 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.611241 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.613440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmg8f\" (UniqueName: \"kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.624423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data\") pod \"nova-api-0\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") " pod="openstack/nova-api-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.636754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.638474 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.661763 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"]
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.708677 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.708743 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.708772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dhf\" (UniqueName: \"kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.708867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.708938 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnng\" (UniqueName: \"kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709831 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709924 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.709986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.710014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0"
Jan 21 15:49:50 crc kubenswrapper[4773]: I0121
15:49:50.711076 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pkbm\" (UniqueName: \"kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.711126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.711173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9b8n\" (UniqueName: \"kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.711169 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.711189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.719592 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.719958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.737459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9b8n\" (UniqueName: \"kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n\") pod \"nova-metadata-0\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") " pod="openstack/nova-metadata-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.739495 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.815121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlnng\" (UniqueName: \"kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.816036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.816272 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.816455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.816685 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 
15:49:50.816876 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pkbm\" (UniqueName: \"kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.817057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.819197 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.820354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.824357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.824797 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.824950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.825046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8dhf\" (UniqueName: \"kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.825245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.826142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.826969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.829624 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.831411 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.833603 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.837373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.841091 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.841121 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlnng\" (UniqueName: \"kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng\") pod \"dnsmasq-dns-78cd565959-fqhnk\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.846453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pkbm\" (UniqueName: \"kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm\") pod \"nova-scheduler-0\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") " pod="openstack/nova-scheduler-0" Jan 21 15:49:50 crc kubenswrapper[4773]: I0121 15:49:50.863423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dhf\" (UniqueName: \"kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf\") pod \"nova-cell1-novncproxy-0\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.028287 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.043408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.066434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.081742 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.121059 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xhr58"] Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.253812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xhr58" event={"ID":"da351801-59dd-48f7-a0cd-4b5466057278","Type":"ContainerStarted","Data":"4077ce1810e5ff143b9d077878c921b803413c25ef7fe0144ab12869ca44adbe"} Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.444847 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:49:51 crc kubenswrapper[4773]: W0121 15:49:51.467733 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9b66a6_8df5_445a_8f37_979886a1c43b.slice/crio-d3f046162a06337f799382669f569f103421a627e20ab5f89905c016c50937df WatchSource:0}: Error finding container d3f046162a06337f799382669f569f103421a627e20ab5f89905c016c50937df: Status 404 returned error can't find the container with id d3f046162a06337f799382669f569f103421a627e20ab5f89905c016c50937df Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.702645 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-djq9w"] Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.719426 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.720854 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-djq9w"] Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.727039 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.727048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.750148 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.750245 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.750377 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.750633 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.853632 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.854073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.854153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.854286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.863334 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.866909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.866982 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.888376 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.898499 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:51 crc kubenswrapper[4773]: I0121 15:49:51.918753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv\") pod \"nova-cell1-conductor-db-sync-djq9w\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") " pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.054260 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-djq9w" Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.091247 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"] Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.127115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.277871 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerStarted","Data":"0b6b975444887a577f8175346d93b756540951e1d2841fc1bd81127d0b1c5d3e"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.283527 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerStarted","Data":"d3f046162a06337f799382669f569f103421a627e20ab5f89905c016c50937df"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.290485 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38800db9-ae4b-47a9-938e-0264e1bb6680","Type":"ContainerStarted","Data":"379338dac5b1ffa11e17d0535f6c0bb1bdbac103c40cf49633435f3d9fd23185"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.294398 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" event={"ID":"99ed7666-e174-4d86-931c-4c04712d5a26","Type":"ContainerStarted","Data":"ef3192fd6f1558b8b7b1b46a80dca8de4ff96c6029c64274c43ab78d31c1952e"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.298633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8cee6238-182b-4467-95ef-95cc4fbf5423","Type":"ContainerStarted","Data":"bc71773b1a9a320da0549a4c0935ebf4c50970267a56b2b72d56af93abf3062b"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.303640 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xhr58" event={"ID":"da351801-59dd-48f7-a0cd-4b5466057278","Type":"ContainerStarted","Data":"f4607388de82d4843137cf8a6d25aa39d8e8cc3f85e0dfd6d7c406f904ac2d29"} Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.332986 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xhr58" podStartSLOduration=2.332930115 podStartE2EDuration="2.332930115s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:52.322091461 +0000 UTC m=+1557.246581083" watchObservedRunningTime="2026-01-21 15:49:52.332930115 +0000 UTC m=+1557.257419747" Jan 21 15:49:52 crc kubenswrapper[4773]: I0121 15:49:52.797462 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-djq9w"] Jan 21 15:49:52 crc kubenswrapper[4773]: W0121 15:49:52.816248 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod815b33e5_5403_49d5_941d_8dc85c57a336.slice/crio-7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0 WatchSource:0}: Error finding container 7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0: Status 404 returned error can't find the container with id 7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0 Jan 21 15:49:53 crc kubenswrapper[4773]: I0121 15:49:53.328325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-djq9w" event={"ID":"815b33e5-5403-49d5-941d-8dc85c57a336","Type":"ContainerStarted","Data":"078c10f544cb57265747dcccf59de4ec4af518ed1875b8edf3aa3cce8a6c25fb"} Jan 21 15:49:53 crc kubenswrapper[4773]: I0121 15:49:53.329112 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-djq9w" event={"ID":"815b33e5-5403-49d5-941d-8dc85c57a336","Type":"ContainerStarted","Data":"7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0"} Jan 21 15:49:53 crc kubenswrapper[4773]: I0121 15:49:53.355014 4773 generic.go:334] "Generic (PLEG): container finished" podID="99ed7666-e174-4d86-931c-4c04712d5a26" containerID="5e804b1934bf64fffb517df2a3db9feb381dada3624af6f6d25e242824094239" exitCode=0 Jan 21 15:49:53 crc kubenswrapper[4773]: I0121 15:49:53.356449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" event={"ID":"99ed7666-e174-4d86-931c-4c04712d5a26","Type":"ContainerDied","Data":"5e804b1934bf64fffb517df2a3db9feb381dada3624af6f6d25e242824094239"} Jan 21 15:49:53 crc kubenswrapper[4773]: I0121 15:49:53.383321 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-djq9w" podStartSLOduration=2.383294879 podStartE2EDuration="2.383294879s" podCreationTimestamp="2026-01-21 15:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:53.35474307 +0000 UTC m=+1558.279232712" watchObservedRunningTime="2026-01-21 15:49:53.383294879 +0000 UTC m=+1558.307784501" Jan 21 15:49:54 crc kubenswrapper[4773]: I0121 15:49:54.266854 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:49:54 crc kubenswrapper[4773]: I0121 15:49:54.281494 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 15:49:54 crc kubenswrapper[4773]: I0121 15:49:54.371870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" event={"ID":"99ed7666-e174-4d86-931c-4c04712d5a26","Type":"ContainerStarted","Data":"e467aab9eb9f012dc0c49a73e1deb2c634e2e38ffe3f84433aa6cb84228eefcc"} Jan 21 15:49:54 crc 
kubenswrapper[4773]: I0121 15:49:54.424986 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" podStartSLOduration=4.424959384 podStartE2EDuration="4.424959384s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:49:54.424177113 +0000 UTC m=+1559.348666735" watchObservedRunningTime="2026-01-21 15:49:54.424959384 +0000 UTC m=+1559.349449006"
Jan 21 15:49:55 crc kubenswrapper[4773]: I0121 15:49:55.207788 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:49:55 crc kubenswrapper[4773]: I0121 15:49:55.208148 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:49:55 crc kubenswrapper[4773]: I0121 15:49:55.416399 4773 generic.go:334] "Generic (PLEG): container finished" podID="4ad976fa-167a-4810-9be8-59144083fe5b" containerID="af85c068de34a167ec78b7828382cc548a72139657ed074f4c80f80fdc5e3bfb" exitCode=0
Jan 21 15:49:55 crc kubenswrapper[4773]: I0121 15:49:55.417034 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerDied","Data":"af85c068de34a167ec78b7828382cc548a72139657ed074f4c80f80fdc5e3bfb"}
Jan 21 15:49:55 crc kubenswrapper[4773]: I0121 15:49:55.417177 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-fqhnk"
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.440167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ad976fa-167a-4810-9be8-59144083fe5b","Type":"ContainerDied","Data":"d5200b494dfcce8b990d59ff18999d80e641b82ef9bc3a405dd16ebea567c917"}
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.440663 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5200b494dfcce8b990d59ff18999d80e641b82ef9bc3a405dd16ebea567c917"
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.577399 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.614878 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.614958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615007 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwwht\" (UniqueName: \"kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615246 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615455 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.615750 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.616998 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.638957 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht" (OuterVolumeSpecName: "kube-api-access-nwwht") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "kube-api-access-nwwht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.644927 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts" (OuterVolumeSpecName: "scripts") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.707787 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.718470 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ad976fa-167a-4810-9be8-59144083fe5b-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.718500 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.718515 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwwht\" (UniqueName: \"kubernetes.io/projected/4ad976fa-167a-4810-9be8-59144083fe5b-kube-api-access-nwwht\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.718526 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.792362 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.818902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data" (OuterVolumeSpecName: "config-data") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.819584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") pod \"4ad976fa-167a-4810-9be8-59144083fe5b\" (UID: \"4ad976fa-167a-4810-9be8-59144083fe5b\") "
Jan 21 15:49:57 crc kubenswrapper[4773]: W0121 15:49:57.819840 4773 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4ad976fa-167a-4810-9be8-59144083fe5b/volumes/kubernetes.io~secret/config-data
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.819866 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data" (OuterVolumeSpecName: "config-data") pod "4ad976fa-167a-4810-9be8-59144083fe5b" (UID: "4ad976fa-167a-4810-9be8-59144083fe5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.820263 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:57 crc kubenswrapper[4773]: I0121 15:49:57.820288 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ad976fa-167a-4810-9be8-59144083fe5b-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.452042 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerStarted","Data":"3ccb5e3e26b1edc575e82b461d928f2e95810e0a17aedc5bd36ad1cbeeef3669"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.453424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerStarted","Data":"99e3d0bde5724a4d3676536f67cb2ad6aeccc2739adf146d699b60ebcd16f59f"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.454364 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38800db9-ae4b-47a9-938e-0264e1bb6680","Type":"ContainerStarted","Data":"b3b55e1a46b45ab91a3b139ca594b5f79f61b6299698b7bea5009b9d5ead262b"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.454428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="38800db9-ae4b-47a9-938e-0264e1bb6680" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b3b55e1a46b45ab91a3b139ca594b5f79f61b6299698b7bea5009b9d5ead262b" gracePeriod=30
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.456232 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8cee6238-182b-4467-95ef-95cc4fbf5423","Type":"ContainerStarted","Data":"061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.458604 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.458644 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-log" containerID="cri-o://456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083" gracePeriod=30
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.458685 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-metadata" containerID="cri-o://bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7" gracePeriod=30
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.458596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerStarted","Data":"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.458875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerStarted","Data":"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"}
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.476832 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.759274236 podStartE2EDuration="8.476817254s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="2026-01-21 15:49:51.487237316 +0000 UTC m=+1556.411726938" lastFinishedPulling="2026-01-21 15:49:57.204780334 +0000 UTC m=+1562.129269956" observedRunningTime="2026-01-21 15:49:58.473968706 +0000 UTC m=+1563.398458328" watchObservedRunningTime="2026-01-21 15:49:58.476817254 +0000 UTC m=+1563.401306876"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.501423 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.142548322 podStartE2EDuration="8.501401115s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="2026-01-21 15:49:51.850260119 +0000 UTC m=+1556.774749741" lastFinishedPulling="2026-01-21 15:49:57.209112912 +0000 UTC m=+1562.133602534" observedRunningTime="2026-01-21 15:49:58.495824972 +0000 UTC m=+1563.420314604" watchObservedRunningTime="2026-01-21 15:49:58.501401115 +0000 UTC m=+1563.425890757"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.526873 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.104789511 podStartE2EDuration="8.526852849s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="2026-01-21 15:49:51.84517511 +0000 UTC m=+1556.769664722" lastFinishedPulling="2026-01-21 15:49:57.267238438 +0000 UTC m=+1562.191728060" observedRunningTime="2026-01-21 15:49:58.515014846 +0000 UTC m=+1563.439504478" watchObservedRunningTime="2026-01-21 15:49:58.526852849 +0000 UTC m=+1563.451342471"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.543708 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.495179822 podStartE2EDuration="8.543674468s" podCreationTimestamp="2026-01-21 15:49:50 +0000 UTC" firstStartedPulling="2026-01-21 15:49:52.152859264 +0000 UTC m=+1557.077348896" lastFinishedPulling="2026-01-21 15:49:57.20135392 +0000 UTC m=+1562.125843542" observedRunningTime="2026-01-21 15:49:58.533829749 +0000 UTC m=+1563.458319381" watchObservedRunningTime="2026-01-21 15:49:58.543674468 +0000 UTC m=+1563.468164090"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.559783 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.592492 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.609749 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:58 crc kubenswrapper[4773]: E0121 15:49:58.610183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="sg-core"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610201 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="sg-core"
Jan 21 15:49:58 crc kubenswrapper[4773]: E0121 15:49:58.610216 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-central-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610223 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-central-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: E0121 15:49:58.610240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-notification-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610246 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-notification-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: E0121 15:49:58.610272 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="proxy-httpd"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610278 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="proxy-httpd"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610469 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-central-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610480 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="proxy-httpd"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="sg-core"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.610519 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" containerName="ceilometer-notification-agent"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.612413 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.620629 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.622120 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.622365 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4sn\" (UniqueName: \"kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639581 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.639607 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.676034 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94x76"]
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.681137 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.688672 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94x76"]
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.745540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.745720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4sn\" (UniqueName: \"kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.745776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.745902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7nc9\" (UniqueName: \"kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.745981 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.746074 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.746517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.746552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.746585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.746674 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.749006 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.747806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.754766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.755161 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.756256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.763328 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.772679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4sn\" (UniqueName: \"kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn\") pod \"ceilometer-0\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.848542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.848747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.848828 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7nc9\" (UniqueName: \"kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.849932 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.850222 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.870023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7nc9\" (UniqueName: \"kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9\") pod \"community-operators-94x76\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") " pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.887534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:49:58 crc kubenswrapper[4773]: I0121 15:49:58.902128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.404142 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad976fa-167a-4810-9be8-59144083fe5b" path="/var/lib/kubelet/pods/4ad976fa-167a-4810-9be8-59144083fe5b/volumes"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.463853 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.478474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle\") pod \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") "
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.478593 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9b8n\" (UniqueName: \"kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n\") pod \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") "
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.478629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data\") pod \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") "
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.478864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs\") pod \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\" (UID: \"2b6ff2b6-acc2-4a01-982b-ac9be36605bd\") "
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.479469 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs" (OuterVolumeSpecName: "logs") pod "2b6ff2b6-acc2-4a01-982b-ac9be36605bd" (UID: "2b6ff2b6-acc2-4a01-982b-ac9be36605bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.486300 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-logs\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.495923 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n" (OuterVolumeSpecName: "kube-api-access-d9b8n") pod "2b6ff2b6-acc2-4a01-982b-ac9be36605bd" (UID: "2b6ff2b6-acc2-4a01-982b-ac9be36605bd"). InnerVolumeSpecName "kube-api-access-d9b8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.504884 4773 generic.go:334] "Generic (PLEG): container finished" podID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerID="bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7" exitCode=0
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.504932 4773 generic.go:334] "Generic (PLEG): container finished" podID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerID="456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083" exitCode=143
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.505018 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.505065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerDied","Data":"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"}
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.505090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerDied","Data":"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"}
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.505102 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b6ff2b6-acc2-4a01-982b-ac9be36605bd","Type":"ContainerDied","Data":"0b6b975444887a577f8175346d93b756540951e1d2841fc1bd81127d0b1c5d3e"}
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.505116 4773 scope.go:117] "RemoveContainer" containerID="bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.522863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6ff2b6-acc2-4a01-982b-ac9be36605bd" (UID: "2b6ff2b6-acc2-4a01-982b-ac9be36605bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.573454 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data" (OuterVolumeSpecName: "config-data") pod "2b6ff2b6-acc2-4a01-982b-ac9be36605bd" (UID: "2b6ff2b6-acc2-4a01-982b-ac9be36605bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.575301 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.649443 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.649485 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9b8n\" (UniqueName: \"kubernetes.io/projected/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-kube-api-access-d9b8n\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.649501 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ff2b6-acc2-4a01-982b-ac9be36605bd-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.689008 4773 scope.go:117] "RemoveContainer" containerID="456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.715080 4773 scope.go:117] "RemoveContainer" containerID="bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"
Jan 21 15:49:59 crc kubenswrapper[4773]: E0121 15:49:59.718873 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7\": container with ID starting with bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7 not found: ID does not exist" containerID="bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.718931 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"} err="failed to get container status \"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7\": rpc error: code = NotFound desc = could not find container \"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7\": container with ID starting with bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7 not found: ID does not exist"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.718965 4773 scope.go:117] "RemoveContainer" containerID="456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"
Jan 21 15:49:59 crc kubenswrapper[4773]: E0121 15:49:59.719831 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083\": container with ID starting with 456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083 not found: ID does not exist" containerID="456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.719859 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"} err="failed to get container status \"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083\": rpc error: code = NotFound desc = could not find container \"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083\": container with ID starting with 456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083 not found: ID does not exist"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.719878 4773 scope.go:117] "RemoveContainer" containerID="bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"
Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.720264 4773 pod_container_deletor.go:53] "DeleteContainer
returned error" containerID={"Type":"cri-o","ID":"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7"} err="failed to get container status \"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7\": rpc error: code = NotFound desc = could not find container \"bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7\": container with ID starting with bb4d7a96038392914fca8a2aa4c869e87171a35a6147f478b913bb1ed2c25df7 not found: ID does not exist" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.720287 4773 scope.go:117] "RemoveContainer" containerID="456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.720628 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083"} err="failed to get container status \"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083\": rpc error: code = NotFound desc = could not find container \"456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083\": container with ID starting with 456208f3852ce762bf8fc4601114ff262fafb9dd6da9957e1d14204e65730083 not found: ID does not exist" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.847267 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.870744 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.887989 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94x76"] Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.904769 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:49:59 crc kubenswrapper[4773]: E0121 15:49:59.905334 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-metadata" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.905355 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-metadata" Jan 21 15:49:59 crc kubenswrapper[4773]: E0121 15:49:59.905389 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-log" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.905394 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-log" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.905600 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-log" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.905626 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" containerName="nova-metadata-metadata" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.906873 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.912084 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.912250 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 15:49:59 crc kubenswrapper[4773]: I0121 15:49:59.920411 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.059675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.059998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.060019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8hf\" (UniqueName: \"kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.060046 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs\") pod \"nova-metadata-0\" 
(UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.060073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.161958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.162003 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8hf\" (UniqueName: \"kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.162047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.162083 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.162264 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.162819 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.169329 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.172159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.177165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.179283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8hf\" (UniqueName: \"kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf\") pod 
\"nova-metadata-0\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.242611 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.522654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerStarted","Data":"0a2e38418106c05f2878c31c9c1cef080dc0899940a425a29f476e202064300c"} Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.525941 4773 generic.go:334] "Generic (PLEG): container finished" podID="59e2db42-e430-4607-82c7-1676b3906dbb" containerID="b93cb23170d82852e45d9962750482eb9b55096c7cc7ef80573d94cfbf0156e7" exitCode=0 Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.526073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerDied","Data":"b93cb23170d82852e45d9962750482eb9b55096c7cc7ef80573d94cfbf0156e7"} Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.526103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerStarted","Data":"8839c5620e32f049bdc764591201b63b723a2ef3c5bd42921e21f4fec26a44e2"} Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.727984 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.739890 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 15:50:00 crc kubenswrapper[4773]: I0121 15:50:00.739959 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 15:50:01 crc kubenswrapper[4773]: 
I0121 15:50:01.044449 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.044662 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.067779 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.078199 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.084645 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.168683 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.168932 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="dnsmasq-dns" containerID="cri-o://118ac53ee6f0e2da22020764297cba6d99a16fca95123276676f78eae861883d" gracePeriod=10 Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.402204 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6ff2b6-acc2-4a01-982b-ac9be36605bd" path="/var/lib/kubelet/pods/2b6ff2b6-acc2-4a01-982b-ac9be36605bd/volumes" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.545220 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerStarted","Data":"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.547314 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="65527bca-7849-47c3-ad54-11916b724542" containerID="118ac53ee6f0e2da22020764297cba6d99a16fca95123276676f78eae861883d" exitCode=0 Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.547393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" event={"ID":"65527bca-7849-47c3-ad54-11916b724542","Type":"ContainerDied","Data":"118ac53ee6f0e2da22020764297cba6d99a16fca95123276676f78eae861883d"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.552605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerStarted","Data":"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.552637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerStarted","Data":"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.552646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerStarted","Data":"9d08957cbb274321a658926beef3de02a37ae6f19b0a0ee249cca81170db99bc"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.554935 4773 generic.go:334] "Generic (PLEG): container finished" podID="da351801-59dd-48f7-a0cd-4b5466057278" containerID="f4607388de82d4843137cf8a6d25aa39d8e8cc3f85e0dfd6d7c406f904ac2d29" exitCode=0 Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.555061 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xhr58" event={"ID":"da351801-59dd-48f7-a0cd-4b5466057278","Type":"ContainerDied","Data":"f4607388de82d4843137cf8a6d25aa39d8e8cc3f85e0dfd6d7c406f904ac2d29"} Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 
15:50:01.606345 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.618839 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.618816884 podStartE2EDuration="2.618816884s" podCreationTimestamp="2026-01-21 15:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:01.586072191 +0000 UTC m=+1566.510561813" watchObservedRunningTime="2026-01-21 15:50:01.618816884 +0000 UTC m=+1566.543306526" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.640946 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.643344 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.688448 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.720867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.720997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbq9\" (UniqueName: \"kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " 
pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.721122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.781007 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.822071 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.824644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.824807 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbq9\" (UniqueName: \"kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 
15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.824917 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.825930 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.826929 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.843998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbq9\" (UniqueName: \"kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9\") pod \"certified-operators-v87vc\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:01 crc kubenswrapper[4773]: I0121 15:50:01.991320 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.001372 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zfbz\" (UniqueName: \"kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131685 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.131790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb\") pod \"65527bca-7849-47c3-ad54-11916b724542\" (UID: \"65527bca-7849-47c3-ad54-11916b724542\") " Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.164427 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz" (OuterVolumeSpecName: "kube-api-access-8zfbz") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "kube-api-access-8zfbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.234460 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zfbz\" (UniqueName: \"kubernetes.io/projected/65527bca-7849-47c3-ad54-11916b724542-kube-api-access-8zfbz\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.252920 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.256912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.257386 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.273630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config" (OuterVolumeSpecName: "config") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.297147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65527bca-7849-47c3-ad54-11916b724542" (UID: "65527bca-7849-47c3-ad54-11916b724542"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.338185 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.338222 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.338236 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.338247 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.338257 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65527bca-7849-47c3-ad54-11916b724542-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.623327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerStarted","Data":"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be"} Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.632624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" event={"ID":"65527bca-7849-47c3-ad54-11916b724542","Type":"ContainerDied","Data":"120b79aa5084dfd88c537bcd04bd5032ef3d7786cbc9c0afd60c2e1c1db62139"} Jan 21 
15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.632670 4773 scope.go:117] "RemoveContainer" containerID="118ac53ee6f0e2da22020764297cba6d99a16fca95123276676f78eae861883d" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.632767 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-k4dvm" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.645639 4773 generic.go:334] "Generic (PLEG): container finished" podID="59e2db42-e430-4607-82c7-1676b3906dbb" containerID="570402c03a8d49ec139c4f8eac23519a8e3ebc60e6cf68b010d337431ff29fa6" exitCode=0 Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.647884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerDied","Data":"570402c03a8d49ec139c4f8eac23519a8e3ebc60e6cf68b010d337431ff29fa6"} Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.720379 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.732029 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-k4dvm"] Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.750859 4773 scope.go:117] "RemoveContainer" containerID="98ab4bf86fb27e5ce91e6ac8cadda15ae4e9ae56de3cf88dd4f17e0378bf7690" Jan 21 15:50:02 crc kubenswrapper[4773]: I0121 15:50:02.768995 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:02 crc kubenswrapper[4773]: W0121 15:50:02.788178 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ca882b_0344_4227_bcb8_92d3845c0385.slice/crio-d37ac3f818457675b79f2f74cbe13e7103ac7fdb7a48177acb17bf369acfe6a9 WatchSource:0}: Error finding container 
d37ac3f818457675b79f2f74cbe13e7103ac7fdb7a48177acb17bf369acfe6a9: Status 404 returned error can't find the container with id d37ac3f818457675b79f2f74cbe13e7103ac7fdb7a48177acb17bf369acfe6a9 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.161109 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xhr58" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.268426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data\") pod \"da351801-59dd-48f7-a0cd-4b5466057278\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.268570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ktv\" (UniqueName: \"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv\") pod \"da351801-59dd-48f7-a0cd-4b5466057278\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.268589 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle\") pod \"da351801-59dd-48f7-a0cd-4b5466057278\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.268721 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts\") pod \"da351801-59dd-48f7-a0cd-4b5466057278\" (UID: \"da351801-59dd-48f7-a0cd-4b5466057278\") " Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.281279 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv" (OuterVolumeSpecName: "kube-api-access-b9ktv") pod "da351801-59dd-48f7-a0cd-4b5466057278" (UID: "da351801-59dd-48f7-a0cd-4b5466057278"). InnerVolumeSpecName "kube-api-access-b9ktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.285945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts" (OuterVolumeSpecName: "scripts") pod "da351801-59dd-48f7-a0cd-4b5466057278" (UID: "da351801-59dd-48f7-a0cd-4b5466057278"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.310344 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da351801-59dd-48f7-a0cd-4b5466057278" (UID: "da351801-59dd-48f7-a0cd-4b5466057278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.323838 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data" (OuterVolumeSpecName: "config-data") pod "da351801-59dd-48f7-a0cd-4b5466057278" (UID: "da351801-59dd-48f7-a0cd-4b5466057278"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.374091 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.374130 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.374141 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ktv\" (UniqueName: \"kubernetes.io/projected/da351801-59dd-48f7-a0cd-4b5466057278-kube-api-access-b9ktv\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.374150 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da351801-59dd-48f7-a0cd-4b5466057278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.407514 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65527bca-7849-47c3-ad54-11916b724542" path="/var/lib/kubelet/pods/65527bca-7849-47c3-ad54-11916b724542/volumes" Jan 21 15:50:03 crc kubenswrapper[4773]: E0121 15:50:03.584953 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ca882b_0344_4227_bcb8_92d3845c0385.slice/crio-conmon-ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2.scope\": RecentStats: unable to find data in memory cache]" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.663061 4773 generic.go:334] "Generic (PLEG): container finished" podID="13ca882b-0344-4227-bcb8-92d3845c0385" 
containerID="ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2" exitCode=0 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.663252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerDied","Data":"ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2"} Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.664508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerStarted","Data":"d37ac3f818457675b79f2f74cbe13e7103ac7fdb7a48177acb17bf369acfe6a9"} Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.672732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xhr58" event={"ID":"da351801-59dd-48f7-a0cd-4b5466057278","Type":"ContainerDied","Data":"4077ce1810e5ff143b9d077878c921b803413c25ef7fe0144ab12869ca44adbe"} Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.672762 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4077ce1810e5ff143b9d077878c921b803413c25ef7fe0144ab12869ca44adbe" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.672820 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xhr58" Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.787184 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.787618 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-api" containerID="cri-o://3ccb5e3e26b1edc575e82b461d928f2e95810e0a17aedc5bd36ad1cbeeef3669" gracePeriod=30 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.787565 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-log" containerID="cri-o://99e3d0bde5724a4d3676536f67cb2ad6aeccc2739adf146d699b60ebcd16f59f" gracePeriod=30 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.817886 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.818163 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler" containerID="cri-o://061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" gracePeriod=30 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.832125 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.832346 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-log" containerID="cri-o://cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" gracePeriod=30 Jan 21 15:50:03 crc kubenswrapper[4773]: I0121 15:50:03.832821 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-metadata" containerID="cri-o://354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" gracePeriod=30 Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.647279 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.689842 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerStarted","Data":"76fdfa3c95bdb66222c7567fcddb022bf034da0d8f97bd2ba494ad8738daeb5b"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.696061 4773 generic.go:334] "Generic (PLEG): container finished" podID="815b33e5-5403-49d5-941d-8dc85c57a336" containerID="078c10f544cb57265747dcccf59de4ec4af518ed1875b8edf3aa3cce8a6c25fb" exitCode=0 Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.696242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-djq9w" event={"ID":"815b33e5-5403-49d5-941d-8dc85c57a336","Type":"ContainerDied","Data":"078c10f544cb57265747dcccf59de4ec4af518ed1875b8edf3aa3cce8a6c25fb"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.697891 4773 generic.go:334] "Generic (PLEG): container finished" podID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerID="354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" exitCode=0 Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.697987 4773 generic.go:334] "Generic (PLEG): container finished" podID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerID="cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" exitCode=143 Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.698080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerDied","Data":"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.698151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerDied","Data":"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.698212 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c","Type":"ContainerDied","Data":"9d08957cbb274321a658926beef3de02a37ae6f19b0a0ee249cca81170db99bc"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.698277 4773 scope.go:117] "RemoveContainer" containerID="354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.698444 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.702343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerStarted","Data":"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.711277 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94x76" podStartSLOduration=3.1175009989999998 podStartE2EDuration="6.711259942s" podCreationTimestamp="2026-01-21 15:49:58 +0000 UTC" firstStartedPulling="2026-01-21 15:50:00.529998223 +0000 UTC m=+1565.454487845" lastFinishedPulling="2026-01-21 15:50:04.123757166 +0000 UTC m=+1569.048246788" observedRunningTime="2026-01-21 15:50:04.708774354 +0000 UTC m=+1569.633263996" watchObservedRunningTime="2026-01-21 15:50:04.711259942 +0000 UTC m=+1569.635749554" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.720115 4773 generic.go:334] "Generic (PLEG): container finished" podID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerID="99e3d0bde5724a4d3676536f67cb2ad6aeccc2739adf146d699b60ebcd16f59f" exitCode=143 Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.720408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerDied","Data":"99e3d0bde5724a4d3676536f67cb2ad6aeccc2739adf146d699b60ebcd16f59f"} Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.726982 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs\") pod \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.727088 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data\") pod \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.727261 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle\") pod \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.727302 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs\") pod \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.727329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8hf\" (UniqueName: \"kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf\") pod \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\" (UID: \"14c52f5e-c50f-4fd2-bd4a-d8b664b6919c\") " Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.732033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs" (OuterVolumeSpecName: "logs") pod "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" (UID: "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.733219 4773 scope.go:117] "RemoveContainer" containerID="cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.735048 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf" (OuterVolumeSpecName: "kube-api-access-zq8hf") pod "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" (UID: "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c"). InnerVolumeSpecName "kube-api-access-zq8hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.829828 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8hf\" (UniqueName: \"kubernetes.io/projected/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-kube-api-access-zq8hf\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.829861 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.852271 4773 scope.go:117] "RemoveContainer" containerID="354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" Jan 21 15:50:04 crc kubenswrapper[4773]: E0121 15:50:04.852747 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b\": container with ID starting with 354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b not found: ID does not exist" containerID="354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.852801 4773 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b"} err="failed to get container status \"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b\": rpc error: code = NotFound desc = could not find container \"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b\": container with ID starting with 354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b not found: ID does not exist" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.852837 4773 scope.go:117] "RemoveContainer" containerID="cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" Jan 21 15:50:04 crc kubenswrapper[4773]: E0121 15:50:04.854037 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40\": container with ID starting with cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40 not found: ID does not exist" containerID="cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.854064 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40"} err="failed to get container status \"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40\": rpc error: code = NotFound desc = could not find container \"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40\": container with ID starting with cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40 not found: ID does not exist" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.854080 4773 scope.go:117] "RemoveContainer" containerID="354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.854391 4773 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b"} err="failed to get container status \"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b\": rpc error: code = NotFound desc = could not find container \"354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b\": container with ID starting with 354f05a9f0b9573906baaead7c57db21fce4acf6f5283099eaa03db4a3db237b not found: ID does not exist" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.854413 4773 scope.go:117] "RemoveContainer" containerID="cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.854626 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40"} err="failed to get container status \"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40\": rpc error: code = NotFound desc = could not find container \"cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40\": container with ID starting with cfe24b13cdf01c71aeedb6c5880debc47c7bd64b92a594abc03fca95c512fc40 not found: ID does not exist" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.869139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" (UID: "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.869179 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data" (OuterVolumeSpecName: "config-data") pod "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" (UID: "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.872934 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" (UID: "14c52f5e-c50f-4fd2-bd4a-d8b664b6919c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.931320 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.931361 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:04 crc kubenswrapper[4773]: I0121 15:50:04.931374 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.042243 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.055156 4773 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.071627 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:05 crc kubenswrapper[4773]: E0121 15:50:05.072836 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da351801-59dd-48f7-a0cd-4b5466057278" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.073002 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="da351801-59dd-48f7-a0cd-4b5466057278" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4773]: E0121 15:50:05.073183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="init" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.073390 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="init" Jan 21 15:50:05 crc kubenswrapper[4773]: E0121 15:50:05.073511 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-log" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.074413 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-log" Jan 21 15:50:05 crc kubenswrapper[4773]: E0121 15:50:05.074595 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.074726 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4773]: E0121 15:50:05.074814 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="dnsmasq-dns" Jan 21 
15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.074896 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="dnsmasq-dns" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.075221 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-log" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.075322 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" containerName="nova-metadata-metadata" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.075406 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="65527bca-7849-47c3-ad54-11916b724542" containerName="dnsmasq-dns" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.075499 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="da351801-59dd-48f7-a0cd-4b5466057278" containerName="nova-manage" Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.077111 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.086293 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.088731 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.094507 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.135017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g957t\" (UniqueName: \"kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.135390 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.135528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.135767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.136070 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.237560 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g957t\" (UniqueName: \"kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.237783 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.237916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.238071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.238277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.238478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.243490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.243588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.244195 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.259333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g957t\" (UniqueName: \"kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t\") pod \"nova-metadata-0\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " pod="openstack/nova-metadata-0"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.401050 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c52f5e-c50f-4fd2-bd4a-d8b664b6919c" path="/var/lib/kubelet/pods/14c52f5e-c50f-4fd2-bd4a-d8b664b6919c/volumes"
Jan 21 15:50:05 crc kubenswrapper[4773]: I0121 15:50:05.429321 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:06.046617 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:06.048667 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:06.049985 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:06.050055 4773 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:08.764779 4773 generic.go:334] "Generic (PLEG): container finished" podID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" exitCode=0
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:08.764880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8cee6238-182b-4467-95ef-95cc4fbf5423","Type":"ContainerDied","Data":"061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e"}
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:08.903095 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:08.903138 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:08.948789 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.416957 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"]
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.420012 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.427017 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"]
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.524685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.524877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.524926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bxg\" (UniqueName: \"kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.626557 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.626642 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bxg\" (UniqueName: \"kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.626733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.627484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.627514 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.651826 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bxg\" (UniqueName: \"kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg\") pod \"redhat-marketplace-9k9q5\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.740754 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k9q5"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:09.829042 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:11.046463 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e is running failed: container process not found" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:11.048054 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e is running failed: container process not found" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:11.048857 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e is running failed: container process not found" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 21 15:50:11 crc kubenswrapper[4773]: E0121 15:50:11.048906 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.201821 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94x76"]
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.817953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerStarted","Data":"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109"}
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.824635 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerStarted","Data":"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9"}
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.825459 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.827640 4773 generic.go:334] "Generic (PLEG): container finished" podID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerID="3ccb5e3e26b1edc575e82b461d928f2e95810e0a17aedc5bd36ad1cbeeef3669" exitCode=0
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.827678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerDied","Data":"3ccb5e3e26b1edc575e82b461d928f2e95810e0a17aedc5bd36ad1cbeeef3669"}
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.828000 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94x76" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="registry-server" containerID="cri-o://76fdfa3c95bdb66222c7567fcddb022bf034da0d8f97bd2ba494ad8738daeb5b" gracePeriod=2
Jan 21 15:50:11 crc kubenswrapper[4773]: I0121 15:50:11.890859 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.062354626 podStartE2EDuration="13.890836893s" podCreationTimestamp="2026-01-21 15:49:58 +0000 UTC" firstStartedPulling="2026-01-21 15:49:59.60207732 +0000 UTC m=+1564.526566942" lastFinishedPulling="2026-01-21 15:50:11.430559587 +0000 UTC m=+1576.355049209" observedRunningTime="2026-01-21 15:50:11.869682206 +0000 UTC m=+1576.794171848" watchObservedRunningTime="2026-01-21 15:50:11.890836893 +0000 UTC m=+1576.815326515"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.123339 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.451154 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.534553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pkbm\" (UniqueName: \"kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm\") pod \"8cee6238-182b-4467-95ef-95cc4fbf5423\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.534753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data\") pod \"8cee6238-182b-4467-95ef-95cc4fbf5423\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.534861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle\") pod \"8cee6238-182b-4467-95ef-95cc4fbf5423\" (UID: \"8cee6238-182b-4467-95ef-95cc4fbf5423\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.551376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm" (OuterVolumeSpecName: "kube-api-access-6pkbm") pod "8cee6238-182b-4467-95ef-95cc4fbf5423" (UID: "8cee6238-182b-4467-95ef-95cc4fbf5423"). InnerVolumeSpecName "kube-api-access-6pkbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.580507 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cee6238-182b-4467-95ef-95cc4fbf5423" (UID: "8cee6238-182b-4467-95ef-95cc4fbf5423"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.594259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data" (OuterVolumeSpecName: "config-data") pod "8cee6238-182b-4467-95ef-95cc4fbf5423" (UID: "8cee6238-182b-4467-95ef-95cc4fbf5423"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.643894 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.644202 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cee6238-182b-4467-95ef-95cc4fbf5423-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.644215 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pkbm\" (UniqueName: \"kubernetes.io/projected/8cee6238-182b-4467-95ef-95cc4fbf5423-kube-api-access-6pkbm\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.679835 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-djq9w"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.749765 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data\") pod \"815b33e5-5403-49d5-941d-8dc85c57a336\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.749822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv\") pod \"815b33e5-5403-49d5-941d-8dc85c57a336\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.749887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle\") pod \"815b33e5-5403-49d5-941d-8dc85c57a336\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.750028 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts\") pod \"815b33e5-5403-49d5-941d-8dc85c57a336\" (UID: \"815b33e5-5403-49d5-941d-8dc85c57a336\") "
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.753084 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"]
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.757328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv" (OuterVolumeSpecName: "kube-api-access-zdhbv") pod "815b33e5-5403-49d5-941d-8dc85c57a336" (UID: "815b33e5-5403-49d5-941d-8dc85c57a336"). InnerVolumeSpecName "kube-api-access-zdhbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.764053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts" (OuterVolumeSpecName: "scripts") pod "815b33e5-5403-49d5-941d-8dc85c57a336" (UID: "815b33e5-5403-49d5-941d-8dc85c57a336"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: W0121 15:50:12.769945 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0137586a_4c91_4bba_8613_f628d10da315.slice/crio-4e40894a416936f826177af72d5f783fb71fb815ab05b079327f02222fb51f92 WatchSource:0}: Error finding container 4e40894a416936f826177af72d5f783fb71fb815ab05b079327f02222fb51f92: Status 404 returned error can't find the container with id 4e40894a416936f826177af72d5f783fb71fb815ab05b079327f02222fb51f92
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.824683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "815b33e5-5403-49d5-941d-8dc85c57a336" (UID: "815b33e5-5403-49d5-941d-8dc85c57a336"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.830585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data" (OuterVolumeSpecName: "config-data") pod "815b33e5-5403-49d5-941d-8dc85c57a336" (UID: "815b33e5-5403-49d5-941d-8dc85c57a336"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.851882 4773 generic.go:334] "Generic (PLEG): container finished" podID="59e2db42-e430-4607-82c7-1676b3906dbb" containerID="76fdfa3c95bdb66222c7567fcddb022bf034da0d8f97bd2ba494ad8738daeb5b" exitCode=0
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.851962 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerDied","Data":"76fdfa3c95bdb66222c7567fcddb022bf034da0d8f97bd2ba494ad8738daeb5b"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.853811 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.853836 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.853846 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdhbv\" (UniqueName: \"kubernetes.io/projected/815b33e5-5403-49d5-941d-8dc85c57a336-kube-api-access-zdhbv\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.853880 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b33e5-5403-49d5-941d-8dc85c57a336-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.866271 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-djq9w"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.866600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-djq9w" event={"ID":"815b33e5-5403-49d5-941d-8dc85c57a336","Type":"ContainerDied","Data":"7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.866668 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7322d9baf703d27b6930c53c14f481f9075f8fe600f2a1af98b5bec8886116f0"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.872201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerStarted","Data":"4e40894a416936f826177af72d5f783fb71fb815ab05b079327f02222fb51f92"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.875999 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerStarted","Data":"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.876034 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerStarted","Data":"83c1702257a63f2cb3116c3ece74b722d357d0745472a2b73558185b7831f696"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.881441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8cee6238-182b-4467-95ef-95cc4fbf5423","Type":"ContainerDied","Data":"bc71773b1a9a320da0549a4c0935ebf4c50970267a56b2b72d56af93abf3062b"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.881500 4773 scope.go:117] "RemoveContainer" containerID="061cc4c20162136c8482aa81f5697d3dca33443fae964f9381d3f3f5fef9095e"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.881737 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.892443 4773 generic.go:334] "Generic (PLEG): container finished" podID="13ca882b-0344-4227-bcb8-92d3845c0385" containerID="9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109" exitCode=0
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.893483 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerDied","Data":"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109"}
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.933799 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=7.933782824 podStartE2EDuration="7.933782824s" podCreationTimestamp="2026-01-21 15:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:12.897059061 +0000 UTC m=+1577.821548683" watchObservedRunningTime="2026-01-21 15:50:12.933782824 +0000 UTC m=+1577.858272446"
Jan 21 15:50:12 crc kubenswrapper[4773]: I0121 15:50:12.979340 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.025937 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94x76"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.051877 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.062551 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmg8f\" (UniqueName: \"kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f\") pod \"ab9b66a6-8df5-445a-8f37-979886a1c43b\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.062827 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle\") pod \"ab9b66a6-8df5-445a-8f37-979886a1c43b\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.063192 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data\") pod \"ab9b66a6-8df5-445a-8f37-979886a1c43b\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.063237 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs\") pod \"ab9b66a6-8df5-445a-8f37-979886a1c43b\" (UID: \"ab9b66a6-8df5-445a-8f37-979886a1c43b\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.065381 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs" (OuterVolumeSpecName: "logs") pod "ab9b66a6-8df5-445a-8f37-979886a1c43b" (UID: "ab9b66a6-8df5-445a-8f37-979886a1c43b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.080783 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f" (OuterVolumeSpecName: "kube-api-access-hmg8f") pod "ab9b66a6-8df5-445a-8f37-979886a1c43b" (UID: "ab9b66a6-8df5-445a-8f37-979886a1c43b"). InnerVolumeSpecName "kube-api-access-hmg8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.090217 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.104740 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105188 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105207 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105227 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815b33e5-5403-49d5-941d-8dc85c57a336" containerName="nova-cell1-conductor-db-sync"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105233 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="815b33e5-5403-49d5-941d-8dc85c57a336" containerName="nova-cell1-conductor-db-sync"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105248 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-api"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105254 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-api"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105263 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-log"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105269 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-log"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105279 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="registry-server"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105285 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="registry-server"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105296 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="extract-utilities"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105302 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="extract-utilities"
Jan 21 15:50:13 crc kubenswrapper[4773]: E0121 15:50:13.105320 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="extract-content"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105326 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="extract-content"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105523 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-api"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105547 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" containerName="nova-scheduler-scheduler"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105566 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" containerName="registry-server"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105584 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="815b33e5-5403-49d5-941d-8dc85c57a336" containerName="nova-cell1-conductor-db-sync"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.105594 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" containerName="nova-api-log"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.106624 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.114363 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.114837 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9b66a6-8df5-445a-8f37-979886a1c43b" (UID: "ab9b66a6-8df5-445a-8f37-979886a1c43b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.125462 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data" (OuterVolumeSpecName: "config-data") pod "ab9b66a6-8df5-445a-8f37-979886a1c43b" (UID: "ab9b66a6-8df5-445a-8f37-979886a1c43b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.131772 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.166243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities\") pod \"59e2db42-e430-4607-82c7-1676b3906dbb\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.166331 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content\") pod \"59e2db42-e430-4607-82c7-1676b3906dbb\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.166518 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7nc9\" (UniqueName: \"kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9\") pod \"59e2db42-e430-4607-82c7-1676b3906dbb\" (UID: \"59e2db42-e430-4607-82c7-1676b3906dbb\") "
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.167090 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.167125 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9b66a6-8df5-445a-8f37-979886a1c43b-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.167076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities" (OuterVolumeSpecName: "utilities") pod "59e2db42-e430-4607-82c7-1676b3906dbb" (UID: "59e2db42-e430-4607-82c7-1676b3906dbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.167139 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9b66a6-8df5-445a-8f37-979886a1c43b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.167183 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmg8f\" (UniqueName: \"kubernetes.io/projected/ab9b66a6-8df5-445a-8f37-979886a1c43b-kube-api-access-hmg8f\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.170496 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9" (OuterVolumeSpecName: "kube-api-access-q7nc9") pod "59e2db42-e430-4607-82c7-1676b3906dbb" (UID: "59e2db42-e430-4607-82c7-1676b3906dbb"). InnerVolumeSpecName "kube-api-access-q7nc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.221364 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59e2db42-e430-4607-82c7-1676b3906dbb" (UID: "59e2db42-e430-4607-82c7-1676b3906dbb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.269225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.269681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.270019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmxz6\" (UniqueName: \"kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.270169 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.270224 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e2db42-e430-4607-82c7-1676b3906dbb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.270240 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7nc9\" (UniqueName: \"kubernetes.io/projected/59e2db42-e430-4607-82c7-1676b3906dbb-kube-api-access-q7nc9\") on 
node \"crc\" DevicePath \"\"" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.372680 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmxz6\" (UniqueName: \"kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.372769 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.372914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.376402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.376544 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.392591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wmxz6\" (UniqueName: \"kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6\") pod \"nova-scheduler-0\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.403594 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cee6238-182b-4467-95ef-95cc4fbf5423" path="/var/lib/kubelet/pods/8cee6238-182b-4467-95ef-95cc4fbf5423/volumes" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.497105 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.801295 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.804480 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.809056 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.831562 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.883516 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.883573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.883755 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nndt7\" (UniqueName: \"kubernetes.io/projected/8985a73a-071c-41cd-9828-d74a631c7606-kube-api-access-nndt7\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.908749 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94x76" event={"ID":"59e2db42-e430-4607-82c7-1676b3906dbb","Type":"ContainerDied","Data":"8839c5620e32f049bdc764591201b63b723a2ef3c5bd42921e21f4fec26a44e2"} Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.908821 4773 scope.go:117] "RemoveContainer" containerID="76fdfa3c95bdb66222c7567fcddb022bf034da0d8f97bd2ba494ad8738daeb5b" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.908968 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94x76" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.912607 4773 generic.go:334] "Generic (PLEG): container finished" podID="0137586a-4c91-4bba-8613-f628d10da315" containerID="2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8" exitCode=0 Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.912672 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerDied","Data":"2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8"} Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.914508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerStarted","Data":"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0"} Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.937524 4773 scope.go:117] "RemoveContainer" containerID="570402c03a8d49ec139c4f8eac23519a8e3ebc60e6cf68b010d337431ff29fa6" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.939053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab9b66a6-8df5-445a-8f37-979886a1c43b","Type":"ContainerDied","Data":"d3f046162a06337f799382669f569f103421a627e20ab5f89905c016c50937df"} Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.939142 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.973593 4773 scope.go:117] "RemoveContainer" containerID="b93cb23170d82852e45d9962750482eb9b55096c7cc7ef80573d94cfbf0156e7" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.986081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nndt7\" (UniqueName: \"kubernetes.io/projected/8985a73a-071c-41cd-9828-d74a631c7606-kube-api-access-nndt7\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.986188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.986214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:13 crc kubenswrapper[4773]: I0121 15:50:13.986927 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94x76"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.010359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.014762 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nndt7\" (UniqueName: \"kubernetes.io/projected/8985a73a-071c-41cd-9828-d74a631c7606-kube-api-access-nndt7\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.014896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8985a73a-071c-41cd-9828-d74a631c7606-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8985a73a-071c-41cd-9828-d74a631c7606\") " pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.021906 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94x76"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.039759 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.054829 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.071249 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.073333 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.077049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.093972 4773 scope.go:117] "RemoveContainer" containerID="3ccb5e3e26b1edc575e82b461d928f2e95810e0a17aedc5bd36ad1cbeeef3669" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.094220 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.104134 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: W0121 15:50:14.105951 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod986374f1_375b_4346_bde4_7db28c6f1f4e.slice/crio-82ffcd42a446afc4c1f0c889a0cd3188c129683d0509b8ca946f9a2d197bdd3a WatchSource:0}: Error finding container 82ffcd42a446afc4c1f0c889a0cd3188c129683d0509b8ca946f9a2d197bdd3a: Status 404 returned error can't find the container with id 82ffcd42a446afc4c1f0c889a0cd3188c129683d0509b8ca946f9a2d197bdd3a Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.128779 4773 scope.go:117] "RemoveContainer" containerID="99e3d0bde5724a4d3676536f67cb2ad6aeccc2739adf146d699b60ebcd16f59f" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.143792 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.190034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmxf\" (UniqueName: \"kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.190140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.190200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.190298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.298323 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmxf\" (UniqueName: \"kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.298482 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.298557 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.298704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.299036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.303855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.310264 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc 
kubenswrapper[4773]: I0121 15:50:14.319317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmxf\" (UniqueName: \"kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf\") pod \"nova-api-0\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.400358 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.744234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.956206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986374f1-375b-4346-bde4-7db28c6f1f4e","Type":"ContainerStarted","Data":"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f"} Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.956249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986374f1-375b-4346-bde4-7db28c6f1f4e","Type":"ContainerStarted","Data":"82ffcd42a446afc4c1f0c889a0cd3188c129683d0509b8ca946f9a2d197bdd3a"} Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.964042 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.965647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8985a73a-071c-41cd-9828-d74a631c7606","Type":"ContainerStarted","Data":"232c6262f4a89df839fa4c24ace0a28e2f2d5a9c432c74d3289c55c331519e98"} Jan 21 15:50:14 crc kubenswrapper[4773]: W0121 15:50:14.967838 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d4108e9_0cf1_45fb_9b25_6718091cb0f3.slice/crio-01028f46acc1a744be5a624165c12a0725bd0cc16b3747a078463b05a4df711d WatchSource:0}: Error finding container 01028f46acc1a744be5a624165c12a0725bd0cc16b3747a078463b05a4df711d: Status 404 returned error can't find the container with id 01028f46acc1a744be5a624165c12a0725bd0cc16b3747a078463b05a4df711d Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.980495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerStarted","Data":"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb"} Jan 21 15:50:14 crc kubenswrapper[4773]: I0121 15:50:14.993791 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.993764517 podStartE2EDuration="1.993764517s" podCreationTimestamp="2026-01-21 15:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:14.977684049 +0000 UTC m=+1579.902173671" watchObservedRunningTime="2026-01-21 15:50:14.993764517 +0000 UTC m=+1579.918254139" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.013539 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v87vc" podStartSLOduration=3.671132967 podStartE2EDuration="14.013516406s" podCreationTimestamp="2026-01-21 15:50:01 +0000 UTC" firstStartedPulling="2026-01-21 15:50:03.665807053 +0000 UTC m=+1568.590296675" lastFinishedPulling="2026-01-21 15:50:14.008190492 +0000 UTC m=+1578.932680114" observedRunningTime="2026-01-21 15:50:15.00375694 +0000 UTC m=+1579.928246572" watchObservedRunningTime="2026-01-21 15:50:15.013516406 +0000 UTC m=+1579.938006028" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.401443 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e2db42-e430-4607-82c7-1676b3906dbb" path="/var/lib/kubelet/pods/59e2db42-e430-4607-82c7-1676b3906dbb/volumes" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.403301 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9b66a6-8df5-445a-8f37-979886a1c43b" path="/var/lib/kubelet/pods/ab9b66a6-8df5-445a-8f37-979886a1c43b/volumes" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.430017 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.430086 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.430106 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.430127 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.992231 4773 generic.go:334] "Generic (PLEG): container finished" podID="0137586a-4c91-4bba-8613-f628d10da315" containerID="97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09" exitCode=0 Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.992370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerDied","Data":"97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09"} Jan 21 15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.994778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8985a73a-071c-41cd-9828-d74a631c7606","Type":"ContainerStarted","Data":"07f88825c51f1b1a8274b5c7f77399dcf1f5280464522cb707446f88dbbab953"} Jan 21 
15:50:15 crc kubenswrapper[4773]: I0121 15:50:15.995427 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.005227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerStarted","Data":"6b79eadcd33a08475413160715f0d61e55bf5173812336e0cb9058ead546a5f5"} Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.005273 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerStarted","Data":"7392e4eb01cc80c060ba1fd04c53c1ab953d9c42cde45c76526288a5aa78e95b"} Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.005283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerStarted","Data":"01028f46acc1a744be5a624165c12a0725bd0cc16b3747a078463b05a4df711d"} Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.039656 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.039640118 podStartE2EDuration="3.039640118s" podCreationTimestamp="2026-01-21 15:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:16.03606671 +0000 UTC m=+1580.960556332" watchObservedRunningTime="2026-01-21 15:50:16.039640118 +0000 UTC m=+1580.964129740" Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.061941 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.061922826 podStartE2EDuration="3.061922826s" podCreationTimestamp="2026-01-21 15:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-21 15:50:16.060955109 +0000 UTC m=+1580.985444751" watchObservedRunningTime="2026-01-21 15:50:16.061922826 +0000 UTC m=+1580.986412448" Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.446365 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:16 crc kubenswrapper[4773]: I0121 15:50:16.446413 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:17 crc kubenswrapper[4773]: I0121 15:50:17.021670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerStarted","Data":"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f"} Jan 21 15:50:17 crc kubenswrapper[4773]: I0121 15:50:17.077218 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9k9q5" podStartSLOduration=5.511363469 podStartE2EDuration="8.077195731s" podCreationTimestamp="2026-01-21 15:50:09 +0000 UTC" firstStartedPulling="2026-01-21 15:50:13.916897972 +0000 UTC m=+1578.841387604" lastFinishedPulling="2026-01-21 15:50:16.482730244 +0000 UTC m=+1581.407219866" observedRunningTime="2026-01-21 15:50:17.06359884 +0000 UTC m=+1581.988088482" watchObservedRunningTime="2026-01-21 15:50:17.077195731 +0000 UTC m=+1582.001685353" Jan 21 15:50:18 crc kubenswrapper[4773]: I0121 15:50:18.498172 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Jan 21 15:50:19 crc kubenswrapper[4773]: I0121 15:50:19.741384 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:19 crc kubenswrapper[4773]: I0121 15:50:19.741675 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:19 crc kubenswrapper[4773]: I0121 15:50:19.786575 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:21 crc kubenswrapper[4773]: I0121 15:50:21.103603 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:21 crc kubenswrapper[4773]: I0121 15:50:21.590968 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"] Jan 21 15:50:21 crc kubenswrapper[4773]: I0121 15:50:21.992411 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:21 crc kubenswrapper[4773]: I0121 15:50:21.992460 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:22 crc kubenswrapper[4773]: I0121 15:50:22.040616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:22 crc kubenswrapper[4773]: I0121 15:50:22.114347 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.077765 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9k9q5" podUID="0137586a-4c91-4bba-8613-f628d10da315" 
containerName="registry-server" containerID="cri-o://5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f" gracePeriod=2 Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.498219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.537615 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.715142 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.739235 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities\") pod \"0137586a-4c91-4bba-8613-f628d10da315\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.739445 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content\") pod \"0137586a-4c91-4bba-8613-f628d10da315\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.739645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bxg\" (UniqueName: \"kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg\") pod \"0137586a-4c91-4bba-8613-f628d10da315\" (UID: \"0137586a-4c91-4bba-8613-f628d10da315\") " Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.740279 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities" (OuterVolumeSpecName: "utilities") pod 
"0137586a-4c91-4bba-8613-f628d10da315" (UID: "0137586a-4c91-4bba-8613-f628d10da315"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.741298 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.753989 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg" (OuterVolumeSpecName: "kube-api-access-g5bxg") pod "0137586a-4c91-4bba-8613-f628d10da315" (UID: "0137586a-4c91-4bba-8613-f628d10da315"). InnerVolumeSpecName "kube-api-access-g5bxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.761881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0137586a-4c91-4bba-8613-f628d10da315" (UID: "0137586a-4c91-4bba-8613-f628d10da315"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.843876 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0137586a-4c91-4bba-8613-f628d10da315-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:23 crc kubenswrapper[4773]: I0121 15:50:23.843908 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bxg\" (UniqueName: \"kubernetes.io/projected/0137586a-4c91-4bba-8613-f628d10da315-kube-api-access-g5bxg\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.088451 4773 generic.go:334] "Generic (PLEG): container finished" podID="0137586a-4c91-4bba-8613-f628d10da315" containerID="5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f" exitCode=0 Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.088536 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k9q5" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.088577 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerDied","Data":"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f"} Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.088790 4773 scope.go:117] "RemoveContainer" containerID="5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.089737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k9q5" event={"ID":"0137586a-4c91-4bba-8613-f628d10da315","Type":"ContainerDied","Data":"4e40894a416936f826177af72d5f783fb71fb815ab05b079327f02222fb51f92"} Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.109969 4773 scope.go:117] "RemoveContainer" 
containerID="97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.130155 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.133175 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"] Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.148208 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k9q5"] Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.161293 4773 scope.go:117] "RemoveContainer" containerID="2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.197367 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.200069 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.222375 4773 scope.go:117] "RemoveContainer" containerID="5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f" Jan 21 15:50:24 crc kubenswrapper[4773]: E0121 15:50:24.226345 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f\": container with ID starting with 5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f not found: ID does not exist" containerID="5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.226406 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f"} 
err="failed to get container status \"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f\": rpc error: code = NotFound desc = could not find container \"5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f\": container with ID starting with 5b5e77cd591a2530112e149ff4736d88d158aea5f4f0edbd1062ae9468ff4e0f not found: ID does not exist" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.226452 4773 scope.go:117] "RemoveContainer" containerID="97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09" Jan 21 15:50:24 crc kubenswrapper[4773]: E0121 15:50:24.227101 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09\": container with ID starting with 97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09 not found: ID does not exist" containerID="97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.227148 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09"} err="failed to get container status \"97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09\": rpc error: code = NotFound desc = could not find container \"97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09\": container with ID starting with 97152f7b6fffa29da327cb974813c5ff339cffdfb62f4721ef79a46adbc2de09 not found: ID does not exist" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.227175 4773 scope.go:117] "RemoveContainer" containerID="2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8" Jan 21 15:50:24 crc kubenswrapper[4773]: E0121 15:50:24.227620 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8\": container with ID starting with 2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8 not found: ID does not exist" containerID="2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.227670 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8"} err="failed to get container status \"2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8\": rpc error: code = NotFound desc = could not find container \"2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8\": container with ID starting with 2ae092d19b5052f80f7438321f932266f021a276caa1d9413d7d60ba25aa43c8 not found: ID does not exist" Jan 21 15:50:24 crc kubenswrapper[4773]: E0121 15:50:24.293760 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0137586a_4c91_4bba_8613_f628d10da315.slice\": RecentStats: unable to find data in memory cache]" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.402154 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 15:50:24 crc kubenswrapper[4773]: I0121 15:50:24.402220 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.104408 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v87vc" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="registry-server" containerID="cri-o://535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb" gracePeriod=2 Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.205897 4773 patch_prober.go:28] 
interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.206364 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.407270 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0137586a-4c91-4bba-8613-f628d10da315" path="/var/lib/kubelet/pods/0137586a-4c91-4bba-8613-f628d10da315/volumes" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.488512 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.488829 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.502748 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.519773 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 15:50:25 crc 
kubenswrapper[4773]: I0121 15:50:25.520149 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.702761 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.927807 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities\") pod \"13ca882b-0344-4227-bcb8-92d3845c0385\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.927878 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jbq9\" (UniqueName: \"kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9\") pod \"13ca882b-0344-4227-bcb8-92d3845c0385\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.927969 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content\") pod \"13ca882b-0344-4227-bcb8-92d3845c0385\" (UID: \"13ca882b-0344-4227-bcb8-92d3845c0385\") " Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.949966 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9" (OuterVolumeSpecName: "kube-api-access-2jbq9") pod "13ca882b-0344-4227-bcb8-92d3845c0385" (UID: "13ca882b-0344-4227-bcb8-92d3845c0385"). InnerVolumeSpecName "kube-api-access-2jbq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:25 crc kubenswrapper[4773]: I0121 15:50:25.984403 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities" (OuterVolumeSpecName: "utilities") pod "13ca882b-0344-4227-bcb8-92d3845c0385" (UID: "13ca882b-0344-4227-bcb8-92d3845c0385"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.022119 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13ca882b-0344-4227-bcb8-92d3845c0385" (UID: "13ca882b-0344-4227-bcb8-92d3845c0385"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.029873 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.029922 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jbq9\" (UniqueName: \"kubernetes.io/projected/13ca882b-0344-4227-bcb8-92d3845c0385-kube-api-access-2jbq9\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.029965 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ca882b-0344-4227-bcb8-92d3845c0385-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.122020 4773 generic.go:334] "Generic (PLEG): container finished" podID="13ca882b-0344-4227-bcb8-92d3845c0385" containerID="535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb" exitCode=0 Jan 
21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.122130 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v87vc" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.122195 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerDied","Data":"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb"} Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.122233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v87vc" event={"ID":"13ca882b-0344-4227-bcb8-92d3845c0385","Type":"ContainerDied","Data":"d37ac3f818457675b79f2f74cbe13e7103ac7fdb7a48177acb17bf369acfe6a9"} Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.122253 4773 scope.go:117] "RemoveContainer" containerID="535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.138262 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.158503 4773 scope.go:117] "RemoveContainer" containerID="9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.223623 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.228840 4773 scope.go:117] "RemoveContainer" containerID="ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.244084 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v87vc"] Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.282246 4773 scope.go:117] "RemoveContainer" 
containerID="535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb" Jan 21 15:50:26 crc kubenswrapper[4773]: E0121 15:50:26.285294 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb\": container with ID starting with 535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb not found: ID does not exist" containerID="535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.285334 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb"} err="failed to get container status \"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb\": rpc error: code = NotFound desc = could not find container \"535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb\": container with ID starting with 535fb2e0432610752c96bd867a6e7bd80f8033ced687522704d27357dff1ebcb not found: ID does not exist" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.285358 4773 scope.go:117] "RemoveContainer" containerID="9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109" Jan 21 15:50:26 crc kubenswrapper[4773]: E0121 15:50:26.288212 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109\": container with ID starting with 9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109 not found: ID does not exist" containerID="9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.288250 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109"} err="failed to get container status \"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109\": rpc error: code = NotFound desc = could not find container \"9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109\": container with ID starting with 9846ed42457a5e57f633a869fd40d2256ce2095257ff4bfe61cc3e4ad7684109 not found: ID does not exist" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.288273 4773 scope.go:117] "RemoveContainer" containerID="ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2" Jan 21 15:50:26 crc kubenswrapper[4773]: E0121 15:50:26.289016 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2\": container with ID starting with ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2 not found: ID does not exist" containerID="ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2" Jan 21 15:50:26 crc kubenswrapper[4773]: I0121 15:50:26.289044 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2"} err="failed to get container status \"ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2\": rpc error: code = NotFound desc = could not find container \"ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2\": container with ID starting with ac6515bd6948c0d53dcb8ae4be3ec6be5827b752350c1b6920ebfe617ac149f2 not found: ID does not exist" Jan 21 15:50:27 crc kubenswrapper[4773]: I0121 15:50:27.395402 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" path="/var/lib/kubelet/pods/13ca882b-0344-4227-bcb8-92d3845c0385/volumes" Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 
15:50:29.043167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.215026 4773 generic.go:334] "Generic (PLEG): container finished" podID="38800db9-ae4b-47a9-938e-0264e1bb6680" containerID="b3b55e1a46b45ab91a3b139ca594b5f79f61b6299698b7bea5009b9d5ead262b" exitCode=137 Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.215082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38800db9-ae4b-47a9-938e-0264e1bb6680","Type":"ContainerDied","Data":"b3b55e1a46b45ab91a3b139ca594b5f79f61b6299698b7bea5009b9d5ead262b"} Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.773541 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.921262 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8dhf\" (UniqueName: \"kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf\") pod \"38800db9-ae4b-47a9-938e-0264e1bb6680\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.921354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle\") pod \"38800db9-ae4b-47a9-938e-0264e1bb6680\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.921515 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data\") pod \"38800db9-ae4b-47a9-938e-0264e1bb6680\" (UID: \"38800db9-ae4b-47a9-938e-0264e1bb6680\") " Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 
15:50:29.929020 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf" (OuterVolumeSpecName: "kube-api-access-b8dhf") pod "38800db9-ae4b-47a9-938e-0264e1bb6680" (UID: "38800db9-ae4b-47a9-938e-0264e1bb6680"). InnerVolumeSpecName "kube-api-access-b8dhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.952202 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data" (OuterVolumeSpecName: "config-data") pod "38800db9-ae4b-47a9-938e-0264e1bb6680" (UID: "38800db9-ae4b-47a9-938e-0264e1bb6680"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:29 crc kubenswrapper[4773]: I0121 15:50:29.955983 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38800db9-ae4b-47a9-938e-0264e1bb6680" (UID: "38800db9-ae4b-47a9-938e-0264e1bb6680"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.023880 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.023919 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8dhf\" (UniqueName: \"kubernetes.io/projected/38800db9-ae4b-47a9-938e-0264e1bb6680-kube-api-access-b8dhf\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.023935 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38800db9-ae4b-47a9-938e-0264e1bb6680-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.225606 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38800db9-ae4b-47a9-938e-0264e1bb6680","Type":"ContainerDied","Data":"379338dac5b1ffa11e17d0535f6c0bb1bdbac103c40cf49633435f3d9fd23185"} Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.225672 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.225963 4773 scope.go:117] "RemoveContainer" containerID="b3b55e1a46b45ab91a3b139ca594b5f79f61b6299698b7bea5009b9d5ead262b"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.261789 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.269380 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.281918 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282411 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="extract-content"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282434 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="extract-content"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282449 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282455 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282464 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38800db9-ae4b-47a9-938e-0264e1bb6680" containerName="nova-cell1-novncproxy-novncproxy"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282471 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="38800db9-ae4b-47a9-938e-0264e1bb6680" containerName="nova-cell1-novncproxy-novncproxy"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282490 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282496 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282505 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="extract-content"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="extract-content"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282522 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="extract-utilities"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282528 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="extract-utilities"
Jan 21 15:50:30 crc kubenswrapper[4773]: E0121 15:50:30.282539 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="extract-utilities"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282546 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="extract-utilities"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282911 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="38800db9-ae4b-47a9-938e-0264e1bb6680" containerName="nova-cell1-novncproxy-novncproxy"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282937 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0137586a-4c91-4bba-8613-f628d10da315" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.282950 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca882b-0344-4227-bcb8-92d3845c0385" containerName="registry-server"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.283752 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.288266 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.288558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.288834 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.294973 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.433663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69k6g\" (UniqueName: \"kubernetes.io/projected/c380503c-e5d5-45c3-aeea-5997a1e792c5-kube-api-access-69k6g\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.434649 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.434979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.435104 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.435269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.536891 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.537500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69k6g\" (UniqueName: \"kubernetes.io/projected/c380503c-e5d5-45c3-aeea-5997a1e792c5-kube-api-access-69k6g\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.537607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.537822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.537871 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.541962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.548396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.548968 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.556660 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380503c-e5d5-45c3-aeea-5997a1e792c5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.561191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69k6g\" (UniqueName: \"kubernetes.io/projected/c380503c-e5d5-45c3-aeea-5997a1e792c5-kube-api-access-69k6g\") pod \"nova-cell1-novncproxy-0\" (UID: \"c380503c-e5d5-45c3-aeea-5997a1e792c5\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:30 crc kubenswrapper[4773]: I0121 15:50:30.604087 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:31 crc kubenswrapper[4773]: I0121 15:50:31.080546 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 15:50:31 crc kubenswrapper[4773]: I0121 15:50:31.244121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c380503c-e5d5-45c3-aeea-5997a1e792c5","Type":"ContainerStarted","Data":"2b21def67638ade5172936b3b15dfb7f16ceb4b99d0ca893a83199d78845d5d2"}
Jan 21 15:50:31 crc kubenswrapper[4773]: I0121 15:50:31.403157 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38800db9-ae4b-47a9-938e-0264e1bb6680" path="/var/lib/kubelet/pods/38800db9-ae4b-47a9-938e-0264e1bb6680/volumes"
Jan 21 15:50:32 crc kubenswrapper[4773]: I0121 15:50:32.255152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c380503c-e5d5-45c3-aeea-5997a1e792c5","Type":"ContainerStarted","Data":"4657ed64d01622ebcdfcc202787380ab5c40a80236e54e5b397453ac530b6e66"}
Jan 21 15:50:32 crc kubenswrapper[4773]: I0121 15:50:32.282871 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.282846313 podStartE2EDuration="2.282846313s" podCreationTimestamp="2026-01-21 15:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:32.272360718 +0000 UTC m=+1597.196850350" watchObservedRunningTime="2026-01-21 15:50:32.282846313 +0000 UTC m=+1597.207335935"
Jan 21 15:50:33 crc kubenswrapper[4773]: I0121 15:50:33.618445 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:33 crc kubenswrapper[4773]: I0121 15:50:33.619051 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" containerName="kube-state-metrics" containerID="cri-o://d6d7bd2ee2da7ff8ab8404974ba9925a73d46790cc3cfe296164f20446bdd932" gracePeriod=30
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.290828 4773 generic.go:334] "Generic (PLEG): container finished" podID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" containerID="d6d7bd2ee2da7ff8ab8404974ba9925a73d46790cc3cfe296164f20446bdd932" exitCode=2
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.291108 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7552a02-8d95-4fce-b6b0-7bbac761ad35","Type":"ContainerDied","Data":"d6d7bd2ee2da7ff8ab8404974ba9925a73d46790cc3cfe296164f20446bdd932"}
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.408043 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.409615 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.409764 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.415004 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.568379 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.750959 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rql5b\" (UniqueName: \"kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b\") pod \"d7552a02-8d95-4fce-b6b0-7bbac761ad35\" (UID: \"d7552a02-8d95-4fce-b6b0-7bbac761ad35\") "
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.762269 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b" (OuterVolumeSpecName: "kube-api-access-rql5b") pod "d7552a02-8d95-4fce-b6b0-7bbac761ad35" (UID: "d7552a02-8d95-4fce-b6b0-7bbac761ad35"). InnerVolumeSpecName "kube-api-access-rql5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:50:34 crc kubenswrapper[4773]: I0121 15:50:34.853019 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rql5b\" (UniqueName: \"kubernetes.io/projected/d7552a02-8d95-4fce-b6b0-7bbac761ad35-kube-api-access-rql5b\") on node \"crc\" DevicePath \"\""
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.302165 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.302161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7552a02-8d95-4fce-b6b0-7bbac761ad35","Type":"ContainerDied","Data":"d415121dfbc3351ecc78b0b2eb2e655032eded61af7aa1eac930bfda353d68e4"}
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.302669 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.302716 4773 scope.go:117] "RemoveContainer" containerID="d6d7bd2ee2da7ff8ab8404974ba9925a73d46790cc3cfe296164f20446bdd932"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.344444 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.359407 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.397260 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" path="/var/lib/kubelet/pods/d7552a02-8d95-4fce-b6b0-7bbac761ad35/volumes"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.400116 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:35 crc kubenswrapper[4773]: E0121 15:50:35.401223 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" containerName="kube-state-metrics"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.401242 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" containerName="kube-state-metrics"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.401664 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7552a02-8d95-4fce-b6b0-7bbac761ad35" containerName="kube-state-metrics"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.403275 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.411577 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.412098 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.421764 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.460216 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.568666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdcq7\" (UniqueName: \"kubernetes.io/projected/0182c704-9c2c-460e-8fb3-083edaa77855-kube-api-access-wdcq7\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.569285 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.569453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.569532 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.608150 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.631758 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.638657 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.657041 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.673538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdcq7\" (UniqueName: \"kubernetes.io/projected/0182c704-9c2c-460e-8fb3-083edaa77855-kube-api-access-wdcq7\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.673634 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.673800 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.673830 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.684032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.686402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.688444 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182c704-9c2c-460e-8fb3-083edaa77855-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.732370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdcq7\" (UniqueName: \"kubernetes.io/projected/0182c704-9c2c-460e-8fb3-083edaa77855-kube-api-access-wdcq7\") pod \"kube-state-metrics-0\" (UID: \"0182c704-9c2c-460e-8fb3-083edaa77855\") " pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.771055 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780344 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6zp\" (UniqueName: \"kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780526 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.780603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.882840 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6zp\" (UniqueName: \"kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.882932 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.882993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.883092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.883137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.883202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.884098 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.885187 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.886115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.887461 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.887923 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.912167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6zp\" (UniqueName: \"kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp\") pod \"dnsmasq-dns-5fd9b586ff-8g2hn\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:35 crc kubenswrapper[4773]: I0121 15:50:35.981985 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.326583 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.348945 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.414840 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.415099 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-central-agent" containerID="cri-o://c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544" gracePeriod=30
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.415587 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="proxy-httpd" containerID="cri-o://9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9" gracePeriod=30
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.415652 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="sg-core" containerID="cri-o://45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489" gracePeriod=30
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.415714 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-notification-agent" containerID="cri-o://663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be" gracePeriod=30
Jan 21 15:50:36 crc kubenswrapper[4773]: I0121 15:50:36.574718 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:50:36 crc kubenswrapper[4773]: W0121 15:50:36.583143 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e85c33_f0be_4571_b131_5991f5ae6979.slice/crio-39f5031adacc7c1d8df374bdef745970745795a22000c4dbd1521782bea75d0f WatchSource:0}: Error finding container 39f5031adacc7c1d8df374bdef745970745795a22000c4dbd1521782bea75d0f: Status 404 returned error can't find the container with id 39f5031adacc7c1d8df374bdef745970745795a22000c4dbd1521782bea75d0f
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.339972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0182c704-9c2c-460e-8fb3-083edaa77855","Type":"ContainerStarted","Data":"899109fc8dcea42e7c13ce64d63bb52aa84d8d9433d506d17398109b1ba5a043"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.340795 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.340817 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0182c704-9c2c-460e-8fb3-083edaa77855","Type":"ContainerStarted","Data":"2b539386eee982a5bb0288ddec51ef58919f012c4a1e804c3c7a6f12d1902db7"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.343820 4773 generic.go:334] "Generic (PLEG): container finished" podID="9e315c04-30b5-402f-8863-6612cb639a19" containerID="9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9" exitCode=0
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.344101 4773 generic.go:334] "Generic (PLEG): container finished" podID="9e315c04-30b5-402f-8863-6612cb639a19" containerID="45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489" exitCode=2
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.344170 4773 generic.go:334] "Generic (PLEG): container finished" podID="9e315c04-30b5-402f-8863-6612cb639a19" containerID="c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544" exitCode=0
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.343906 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerDied","Data":"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.344306 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerDied","Data":"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.344340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerDied","Data":"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.346743 4773 generic.go:334] "Generic (PLEG): container finished" podID="83e85c33-f0be-4571-b131-5991f5ae6979" containerID="b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4" exitCode=0
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.348910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" event={"ID":"83e85c33-f0be-4571-b131-5991f5ae6979","Type":"ContainerDied","Data":"b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.348985 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" event={"ID":"83e85c33-f0be-4571-b131-5991f5ae6979","Type":"ContainerStarted","Data":"39f5031adacc7c1d8df374bdef745970745795a22000c4dbd1521782bea75d0f"}
Jan 21 15:50:37 crc kubenswrapper[4773]: I0121 15:50:37.367814 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9974471820000002 podStartE2EDuration="2.367793415s" podCreationTimestamp="2026-01-21 15:50:35 +0000 UTC" firstStartedPulling="2026-01-21 15:50:36.347608535 +0000 UTC m=+1601.272098157" lastFinishedPulling="2026-01-21 15:50:36.717954768 +0000 UTC m=+1601.642444390" observedRunningTime="2026-01-21 15:50:37.361531874 +0000 UTC m=+1602.286021496" watchObservedRunningTime="2026-01-21 15:50:37.367793415 +0000 UTC m=+1602.292283037"
Jan 21 15:50:38 crc kubenswrapper[4773]: I0121 15:50:38.347568 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 21 15:50:38 crc kubenswrapper[4773]: I0121 15:50:38.357672 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-log" containerID="cri-o://7392e4eb01cc80c060ba1fd04c53c1ab953d9c42cde45c76526288a5aa78e95b" gracePeriod=30
Jan 21 15:50:38 crc kubenswrapper[4773]: I0121 15:50:38.357791 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-api" containerID="cri-o://6b79eadcd33a08475413160715f0d61e55bf5173812336e0cb9058ead546a5f5" gracePeriod=30
Jan 21 15:50:40 crc kubenswrapper[4773]: I0121 15:50:40.604517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:40 crc kubenswrapper[4773]: I0121 15:50:40.622873 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.399880 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.589636 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-9s9qp"]
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.593867 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9s9qp"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.596278 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.596548 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.610554 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9s9qp"]
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.634101 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk9n9\" (UniqueName: \"kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.634175 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp"
Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.634228 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID:
\"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.634267 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.736922 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.737045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.737233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk9n9\" (UniqueName: \"kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.737291 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " 
pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.744765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.745060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.750403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.756674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk9n9\" (UniqueName: \"kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9\") pod \"nova-cell1-cell-mapping-9s9qp\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:41 crc kubenswrapper[4773]: I0121 15:50:41.947565 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.270532 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.353377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.353520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.353625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl4sn\" (UniqueName: \"kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.354353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.354566 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.355642 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.355987 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml\") pod \"9e315c04-30b5-402f-8863-6612cb639a19\" (UID: \"9e315c04-30b5-402f-8863-6612cb639a19\") " Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.356283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.359050 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.361338 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts" (OuterVolumeSpecName: "scripts") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.362032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn" (OuterVolumeSpecName: "kube-api-access-jl4sn") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "kube-api-access-jl4sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.363065 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.363101 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.363115 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl4sn\" (UniqueName: \"kubernetes.io/projected/9e315c04-30b5-402f-8863-6612cb639a19-kube-api-access-jl4sn\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.363128 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e315c04-30b5-402f-8863-6612cb639a19-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.402346 4773 generic.go:334] "Generic (PLEG): container finished" podID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerID="6b79eadcd33a08475413160715f0d61e55bf5173812336e0cb9058ead546a5f5" exitCode=0 Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.402388 4773 generic.go:334] "Generic (PLEG): container finished" podID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" 
containerID="7392e4eb01cc80c060ba1fd04c53c1ab953d9c42cde45c76526288a5aa78e95b" exitCode=143 Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.402435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerDied","Data":"6b79eadcd33a08475413160715f0d61e55bf5173812336e0cb9058ead546a5f5"} Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.402466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerDied","Data":"7392e4eb01cc80c060ba1fd04c53c1ab953d9c42cde45c76526288a5aa78e95b"} Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.414893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.426729 4773 generic.go:334] "Generic (PLEG): container finished" podID="9e315c04-30b5-402f-8863-6612cb639a19" containerID="663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be" exitCode=0 Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.426897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerDied","Data":"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be"} Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.426960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e315c04-30b5-402f-8863-6612cb639a19","Type":"ContainerDied","Data":"0a2e38418106c05f2878c31c9c1cef080dc0899940a425a29f476e202064300c"} Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.426982 4773 scope.go:117] "RemoveContainer" containerID="9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.427192 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.442076 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" event={"ID":"83e85c33-f0be-4571-b131-5991f5ae6979","Type":"ContainerStarted","Data":"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"} Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.442224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.470106 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" podStartSLOduration=7.47008645 podStartE2EDuration="7.47008645s" podCreationTimestamp="2026-01-21 15:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:42.463025227 +0000 UTC m=+1607.387514859" watchObservedRunningTime="2026-01-21 15:50:42.47008645 +0000 UTC m=+1607.394576072" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.480798 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.480970 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.497460 4773 scope.go:117] "RemoveContainer" containerID="45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.536165 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9s9qp"] Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.568719 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data" (OuterVolumeSpecName: "config-data") pod "9e315c04-30b5-402f-8863-6612cb639a19" (UID: "9e315c04-30b5-402f-8863-6612cb639a19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.572374 4773 scope.go:117] "RemoveContainer" containerID="663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.603514 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.603534 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e315c04-30b5-402f-8863-6612cb639a19-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.798179 4773 scope.go:117] "RemoveContainer" containerID="c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.841352 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.845271 4773 scope.go:117] "RemoveContainer" 
containerID="9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.846069 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9\": container with ID starting with 9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9 not found: ID does not exist" containerID="9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846118 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9"} err="failed to get container status \"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9\": rpc error: code = NotFound desc = could not find container \"9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9\": container with ID starting with 9267bff28e4fbe877c32306d8e2e23738c7f17f044a942885080bfcff530f4e9 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846161 4773 scope.go:117] "RemoveContainer" containerID="45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.846535 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489\": container with ID starting with 45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489 not found: ID does not exist" containerID="45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846568 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489"} err="failed to get container status \"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489\": rpc error: code = NotFound desc = could not find container \"45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489\": container with ID starting with 45a2bf8ccb7833dbf2f11be3258af9b3364bb501857212a540e1e1a6364ce489 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846588 4773 scope.go:117] "RemoveContainer" containerID="663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.846886 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be\": container with ID starting with 663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be not found: ID does not exist" containerID="663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846916 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be"} err="failed to get container status \"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be\": rpc error: code = NotFound desc = could not find container \"663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be\": container with ID starting with 663107bb5109f1921124df7fc0c358c3d7e9402786332053a9d5be07510c15be not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.846936 4773 scope.go:117] "RemoveContainer" containerID="c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.847225 4773 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544\": container with ID starting with c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544 not found: ID does not exist" containerID="c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.847250 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544"} err="failed to get container status \"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544\": rpc error: code = NotFound desc = could not find container \"c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544\": container with ID starting with c26c6d2df9bd9dfde2d17097e1921d04691949a382e769b1fbf1da9c20057544 not found: ID does not exist" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.857327 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.879864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.880988 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.881059 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.881145 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.881198 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.881262 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.881309 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4773]: E0121 15:50:42.881373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.881577 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.881991 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="proxy-httpd" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.882054 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-central-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.882104 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="ceilometer-notification-agent" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.882169 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e315c04-30b5-402f-8863-6612cb639a19" containerName="sg-core" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.901665 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.905093 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.908016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.908339 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.908493 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:50:42 crc kubenswrapper[4773]: I0121 15:50:42.980078 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023406 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" 
Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023742 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrphw\" (UniqueName: \"kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.023910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.027726 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.133831 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs\") pod \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.133933 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdmxf\" (UniqueName: \"kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf\") pod \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.134298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle\") pod \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.134377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data\") pod \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\" (UID: \"9d4108e9-0cf1-45fb-9b25-6718091cb0f3\") " Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.134827 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.134884 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc 
kubenswrapper[4773]: I0121 15:50:43.134982 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.135047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.135087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrphw\" (UniqueName: \"kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.135153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.135228 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.135254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.138394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.138681 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.142844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs" (OuterVolumeSpecName: "logs") pod "9d4108e9-0cf1-45fb-9b25-6718091cb0f3" (UID: "9d4108e9-0cf1-45fb-9b25-6718091cb0f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.144122 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.144870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.145319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.145370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.155152 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf" (OuterVolumeSpecName: "kube-api-access-jdmxf") pod "9d4108e9-0cf1-45fb-9b25-6718091cb0f3" (UID: "9d4108e9-0cf1-45fb-9b25-6718091cb0f3"). InnerVolumeSpecName "kube-api-access-jdmxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.156559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.160434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrphw\" (UniqueName: \"kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw\") pod \"ceilometer-0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.174103 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d4108e9-0cf1-45fb-9b25-6718091cb0f3" (UID: "9d4108e9-0cf1-45fb-9b25-6718091cb0f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.183505 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data" (OuterVolumeSpecName: "config-data") pod "9d4108e9-0cf1-45fb-9b25-6718091cb0f3" (UID: "9d4108e9-0cf1-45fb-9b25-6718091cb0f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.237426 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.237487 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.237501 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.237513 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdmxf\" (UniqueName: \"kubernetes.io/projected/9d4108e9-0cf1-45fb-9b25-6718091cb0f3-kube-api-access-jdmxf\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.292600 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.400618 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e315c04-30b5-402f-8863-6612cb639a19" path="/var/lib/kubelet/pods/9e315c04-30b5-402f-8863-6612cb639a19/volumes" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.459072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d4108e9-0cf1-45fb-9b25-6718091cb0f3","Type":"ContainerDied","Data":"01028f46acc1a744be5a624165c12a0725bd0cc16b3747a078463b05a4df711d"} Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.459134 4773 scope.go:117] "RemoveContainer" containerID="6b79eadcd33a08475413160715f0d61e55bf5173812336e0cb9058ead546a5f5" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.459093 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.480056 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9s9qp" event={"ID":"0d70a183-ee4f-41e0-9b98-1f921165cecb","Type":"ContainerStarted","Data":"997a78c8b2f79aa75b335c354cd6fcb926298a3d056aeba12a5a77a363706460"} Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.480144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9s9qp" event={"ID":"0d70a183-ee4f-41e0-9b98-1f921165cecb","Type":"ContainerStarted","Data":"7a91b36cd569e347a8c9eed6cd9667f8d2a0748edf734a79db7534f0e7816834"} Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.525213 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.532407 4773 scope.go:117] "RemoveContainer" containerID="7392e4eb01cc80c060ba1fd04c53c1ab953d9c42cde45c76526288a5aa78e95b" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.559568 4773 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.560980 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-9s9qp" podStartSLOduration=2.5609636780000002 podStartE2EDuration="2.560963678s" podCreationTimestamp="2026-01-21 15:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:43.509476953 +0000 UTC m=+1608.433966565" watchObservedRunningTime="2026-01-21 15:50:43.560963678 +0000 UTC m=+1608.485453300" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.593763 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:43 crc kubenswrapper[4773]: E0121 15:50:43.594273 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-api" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.594490 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-api" Jan 21 15:50:43 crc kubenswrapper[4773]: E0121 15:50:43.594511 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-log" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.594520 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-log" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.594768 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-log" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.594790 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" containerName="nova-api-api" Jan 21 15:50:43 crc 
kubenswrapper[4773]: I0121 15:50:43.596029 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.600215 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.600419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.600573 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.608379 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.751438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.751830 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.752016 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhvzz\" (UniqueName: \"kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.752050 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.752100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.752283 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.816079 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.853924 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhvzz\" (UniqueName: \"kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.853988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.854024 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.854127 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.854153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.854181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.854821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.859550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc 
kubenswrapper[4773]: I0121 15:50:43.859622 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.859730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.876359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhvzz\" (UniqueName: \"kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.876398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs\") pod \"nova-api-0\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " pod="openstack/nova-api-0" Jan 21 15:50:43 crc kubenswrapper[4773]: I0121 15:50:43.955576 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:44 crc kubenswrapper[4773]: I0121 15:50:44.476902 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:44 crc kubenswrapper[4773]: I0121 15:50:44.493145 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:44 crc kubenswrapper[4773]: I0121 15:50:44.496327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerStarted","Data":"742cb69f82387598354c63b952c34727f79e9305911bf5bdfa69b3ef25de9e66"} Jan 21 15:50:44 crc kubenswrapper[4773]: I0121 15:50:44.498305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerStarted","Data":"3fcc026a690448de3077cad9185a1fe2b281cb5093f077e1c44fe29373bf3b9a"} Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.397534 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4108e9-0cf1-45fb-9b25-6718091cb0f3" path="/var/lib/kubelet/pods/9d4108e9-0cf1-45fb-9b25-6718091cb0f3/volumes" Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.532068 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerStarted","Data":"84f941db71fa0d2ea9aa324164879bf71bd7753f3f649e9b488491284ca3347a"} Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.532116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerStarted","Data":"22d15257331d33f255d9a137c3424a8e49e24dc9327ff48a28741f525b39bd8e"} Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.534888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerStarted","Data":"a2ff71cd26cf990c3b205f8a5497b2064ab4cfb6a384421358e6a8bedd4522fb"} Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.534944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerStarted","Data":"03690e62deccdc8e18079af67baa9cb59f69c484e1146e09eea801f207b7c6c7"} Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.566003 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5659778920000003 podStartE2EDuration="2.565977892s" podCreationTimestamp="2026-01-21 15:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:45.558148028 +0000 UTC m=+1610.482637670" watchObservedRunningTime="2026-01-21 15:50:45.565977892 +0000 UTC m=+1610.490467514" Jan 21 15:50:45 crc kubenswrapper[4773]: I0121 15:50:45.781107 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 15:50:46 crc kubenswrapper[4773]: I0121 15:50:46.557080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerStarted","Data":"be5ce62e7f48d2131fa2d0ff115b5c68dbf3a4998206ed92b36ac43fd88724e4"} Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.577747 4773 generic.go:334] "Generic (PLEG): container finished" podID="0d70a183-ee4f-41e0-9b98-1f921165cecb" containerID="997a78c8b2f79aa75b335c354cd6fcb926298a3d056aeba12a5a77a363706460" exitCode=0 Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.577898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9s9qp" 
event={"ID":"0d70a183-ee4f-41e0-9b98-1f921165cecb","Type":"ContainerDied","Data":"997a78c8b2f79aa75b335c354cd6fcb926298a3d056aeba12a5a77a363706460"} Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581707 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerStarted","Data":"8aa6ad0cd9e64fd43371c674fce36d0c401ad1c68ddf6091555f385ba284e89e"} Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581800 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-central-agent" containerID="cri-o://22d15257331d33f255d9a137c3424a8e49e24dc9327ff48a28741f525b39bd8e" gracePeriod=30 Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581846 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="sg-core" containerID="cri-o://be5ce62e7f48d2131fa2d0ff115b5c68dbf3a4998206ed92b36ac43fd88724e4" gracePeriod=30 Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581875 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-notification-agent" containerID="cri-o://84f941db71fa0d2ea9aa324164879bf71bd7753f3f649e9b488491284ca3347a" gracePeriod=30 Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581876 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="proxy-httpd" containerID="cri-o://8aa6ad0cd9e64fd43371c674fce36d0c401ad1c68ddf6091555f385ba284e89e" gracePeriod=30 Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.581970 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Jan 21 15:50:48 crc kubenswrapper[4773]: I0121 15:50:48.628571 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.130382238 podStartE2EDuration="6.628553795s" podCreationTimestamp="2026-01-21 15:50:42 +0000 UTC" firstStartedPulling="2026-01-21 15:50:43.822014298 +0000 UTC m=+1608.746503920" lastFinishedPulling="2026-01-21 15:50:47.320185855 +0000 UTC m=+1612.244675477" observedRunningTime="2026-01-21 15:50:48.62031273 +0000 UTC m=+1613.544802372" watchObservedRunningTime="2026-01-21 15:50:48.628553795 +0000 UTC m=+1613.553043417" Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595144 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerID="8aa6ad0cd9e64fd43371c674fce36d0c401ad1c68ddf6091555f385ba284e89e" exitCode=0 Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595448 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerID="be5ce62e7f48d2131fa2d0ff115b5c68dbf3a4998206ed92b36ac43fd88724e4" exitCode=2 Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595460 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerID="84f941db71fa0d2ea9aa324164879bf71bd7753f3f649e9b488491284ca3347a" exitCode=0 Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595204 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerDied","Data":"8aa6ad0cd9e64fd43371c674fce36d0c401ad1c68ddf6091555f385ba284e89e"} Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerDied","Data":"be5ce62e7f48d2131fa2d0ff115b5c68dbf3a4998206ed92b36ac43fd88724e4"} 
Jan 21 15:50:49 crc kubenswrapper[4773]: I0121 15:50:49.595549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerDied","Data":"84f941db71fa0d2ea9aa324164879bf71bd7753f3f649e9b488491284ca3347a"} Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.081724 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.206397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data\") pod \"0d70a183-ee4f-41e0-9b98-1f921165cecb\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.206451 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk9n9\" (UniqueName: \"kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9\") pod \"0d70a183-ee4f-41e0-9b98-1f921165cecb\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.206596 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts\") pod \"0d70a183-ee4f-41e0-9b98-1f921165cecb\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.206888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle\") pod \"0d70a183-ee4f-41e0-9b98-1f921165cecb\" (UID: \"0d70a183-ee4f-41e0-9b98-1f921165cecb\") " Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.213842 4773 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9" (OuterVolumeSpecName: "kube-api-access-bk9n9") pod "0d70a183-ee4f-41e0-9b98-1f921165cecb" (UID: "0d70a183-ee4f-41e0-9b98-1f921165cecb"). InnerVolumeSpecName "kube-api-access-bk9n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.215166 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts" (OuterVolumeSpecName: "scripts") pod "0d70a183-ee4f-41e0-9b98-1f921165cecb" (UID: "0d70a183-ee4f-41e0-9b98-1f921165cecb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.239258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d70a183-ee4f-41e0-9b98-1f921165cecb" (UID: "0d70a183-ee4f-41e0-9b98-1f921165cecb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.249957 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data" (OuterVolumeSpecName: "config-data") pod "0d70a183-ee4f-41e0-9b98-1f921165cecb" (UID: "0d70a183-ee4f-41e0-9b98-1f921165cecb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.310661 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.310800 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.310867 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk9n9\" (UniqueName: \"kubernetes.io/projected/0d70a183-ee4f-41e0-9b98-1f921165cecb-kube-api-access-bk9n9\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.310923 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d70a183-ee4f-41e0-9b98-1f921165cecb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.624037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9s9qp" event={"ID":"0d70a183-ee4f-41e0-9b98-1f921165cecb","Type":"ContainerDied","Data":"7a91b36cd569e347a8c9eed6cd9667f8d2a0748edf734a79db7534f0e7816834"} Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.624099 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a91b36cd569e347a8c9eed6cd9667f8d2a0748edf734a79db7534f0e7816834" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.624222 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9s9qp" Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.792594 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.793426 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-log" containerID="cri-o://03690e62deccdc8e18079af67baa9cb59f69c484e1146e09eea801f207b7c6c7" gracePeriod=30 Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.793466 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-api" containerID="cri-o://a2ff71cd26cf990c3b205f8a5497b2064ab4cfb6a384421358e6a8bedd4522fb" gracePeriod=30 Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.821056 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.821482 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" containerName="nova-scheduler-scheduler" containerID="cri-o://5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" gracePeriod=30 Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.842426 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.842674 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-log" containerID="cri-o://d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e" gracePeriod=30 Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.842791 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-metadata" containerID="cri-o://ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0" gracePeriod=30 Jan 21 15:50:50 crc kubenswrapper[4773]: I0121 15:50:50.983937 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.055079 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"] Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.055369 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="dnsmasq-dns" containerID="cri-o://e467aab9eb9f012dc0c49a73e1deb2c634e2e38ffe3f84433aa6cb84228eefcc" gracePeriod=10 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.640022 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerID="22d15257331d33f255d9a137c3424a8e49e24dc9327ff48a28741f525b39bd8e" exitCode=0 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.640577 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerDied","Data":"22d15257331d33f255d9a137c3424a8e49e24dc9327ff48a28741f525b39bd8e"} Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.643815 4773 generic.go:334] "Generic (PLEG): container finished" podID="707677b6-625c-4504-bc87-efef3e08b410" containerID="a2ff71cd26cf990c3b205f8a5497b2064ab4cfb6a384421358e6a8bedd4522fb" exitCode=0 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.643836 4773 generic.go:334] "Generic (PLEG): container finished" podID="707677b6-625c-4504-bc87-efef3e08b410" 
containerID="03690e62deccdc8e18079af67baa9cb59f69c484e1146e09eea801f207b7c6c7" exitCode=143 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.643868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerDied","Data":"a2ff71cd26cf990c3b205f8a5497b2064ab4cfb6a384421358e6a8bedd4522fb"} Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.643885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerDied","Data":"03690e62deccdc8e18079af67baa9cb59f69c484e1146e09eea801f207b7c6c7"} Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.652961 4773 generic.go:334] "Generic (PLEG): container finished" podID="99ed7666-e174-4d86-931c-4c04712d5a26" containerID="e467aab9eb9f012dc0c49a73e1deb2c634e2e38ffe3f84433aa6cb84228eefcc" exitCode=0 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.653044 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" event={"ID":"99ed7666-e174-4d86-931c-4c04712d5a26","Type":"ContainerDied","Data":"e467aab9eb9f012dc0c49a73e1deb2c634e2e38ffe3f84433aa6cb84228eefcc"} Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.656477 4773 generic.go:334] "Generic (PLEG): container finished" podID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerID="d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e" exitCode=143 Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.656515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerDied","Data":"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e"} Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.764999 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.858828 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlnng\" (UniqueName: \"kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.858924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.858998 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.859223 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.859326 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.859359 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb\") pod \"99ed7666-e174-4d86-931c-4c04712d5a26\" (UID: \"99ed7666-e174-4d86-931c-4c04712d5a26\") " Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.888073 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng" (OuterVolumeSpecName: "kube-api-access-qlnng") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "kube-api-access-qlnng". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.929846 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.936233 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.962338 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.962373 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlnng\" (UniqueName: \"kubernetes.io/projected/99ed7666-e174-4d86-931c-4c04712d5a26-kube-api-access-qlnng\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.962390 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.964622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.966098 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:51 crc kubenswrapper[4773]: I0121 15:50:51.968339 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config" (OuterVolumeSpecName: "config") pod "99ed7666-e174-4d86-931c-4c04712d5a26" (UID: "99ed7666-e174-4d86-931c-4c04712d5a26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.064492 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-config\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.064840 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.064853 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ed7666-e174-4d86-931c-4c04712d5a26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.190101 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.202876 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268565 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268585 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268728 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268800 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268830 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.268979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrphw\" (UniqueName: \"kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269022 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269086 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 
15:50:52.269105 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269121 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts\") pod \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\" (UID: \"b5be80f2-5a0f-41c0-b57d-09011f1785c0\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhvzz\" (UniqueName: \"kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz\") pod \"707677b6-625c-4504-bc87-efef3e08b410\" (UID: \"707677b6-625c-4504-bc87-efef3e08b410\") " Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.269952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs" (OuterVolumeSpecName: "logs") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.270256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.278090 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw" (OuterVolumeSpecName: "kube-api-access-hrphw") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "kube-api-access-hrphw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.279002 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz" (OuterVolumeSpecName: "kube-api-access-xhvzz") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). InnerVolumeSpecName "kube-api-access-xhvzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.284928 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts" (OuterVolumeSpecName: "scripts") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.310675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.355242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.364059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data" (OuterVolumeSpecName: "config-data") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.367908 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371641 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371684 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371718 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371732 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhvzz\" (UniqueName: \"kubernetes.io/projected/707677b6-625c-4504-bc87-efef3e08b410-kube-api-access-xhvzz\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371746 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371757 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/707677b6-625c-4504-bc87-efef3e08b410-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371769 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371781 4773 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371792 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5be80f2-5a0f-41c0-b57d-09011f1785c0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.371804 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrphw\" (UniqueName: \"kubernetes.io/projected/b5be80f2-5a0f-41c0-b57d-09011f1785c0-kube-api-access-hrphw\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.376534 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.382388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.411651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "707677b6-625c-4504-bc87-efef3e08b410" (UID: "707677b6-625c-4504-bc87-efef3e08b410"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.448080 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data" (OuterVolumeSpecName: "config-data") pod "b5be80f2-5a0f-41c0-b57d-09011f1785c0" (UID: "b5be80f2-5a0f-41c0-b57d-09011f1785c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.473815 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.473859 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.473870 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5be80f2-5a0f-41c0-b57d-09011f1785c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.473883 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/707677b6-625c-4504-bc87-efef3e08b410-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.669986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5be80f2-5a0f-41c0-b57d-09011f1785c0","Type":"ContainerDied","Data":"742cb69f82387598354c63b952c34727f79e9305911bf5bdfa69b3ef25de9e66"} Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.670062 4773 scope.go:117] "RemoveContainer" 
containerID="8aa6ad0cd9e64fd43371c674fce36d0c401ad1c68ddf6091555f385ba284e89e" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.670281 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.679202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"707677b6-625c-4504-bc87-efef3e08b410","Type":"ContainerDied","Data":"3fcc026a690448de3077cad9185a1fe2b281cb5093f077e1c44fe29373bf3b9a"} Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.679223 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.683185 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" event={"ID":"99ed7666-e174-4d86-931c-4c04712d5a26","Type":"ContainerDied","Data":"ef3192fd6f1558b8b7b1b46a80dca8de4ff96c6029c64274c43ab78d31c1952e"} Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.683210 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-fqhnk" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.724989 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.730034 4773 scope.go:117] "RemoveContainer" containerID="be5ce62e7f48d2131fa2d0ff115b5c68dbf3a4998206ed92b36ac43fd88724e4" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.752825 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.769098 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.772929 4773 scope.go:117] "RemoveContainer" containerID="84f941db71fa0d2ea9aa324164879bf71bd7753f3f649e9b488491284ca3347a" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.789500 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799080 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799804 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="dnsmasq-dns" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799819 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="dnsmasq-dns" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799828 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d70a183-ee4f-41e0-9b98-1f921165cecb" containerName="nova-manage" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799835 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d70a183-ee4f-41e0-9b98-1f921165cecb" containerName="nova-manage" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 
15:50:52.799850 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-log" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799856 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-log" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799871 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-central-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799878 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-central-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799885 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="init" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799890 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="init" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799898 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-notification-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799903 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-notification-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799915 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-api" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799920 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-api" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 
15:50:52.799934 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="sg-core" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799939 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="sg-core" Jan 21 15:50:52 crc kubenswrapper[4773]: E0121 15:50:52.799949 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="proxy-httpd" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.799954 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="proxy-httpd" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800138 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="proxy-httpd" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800151 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d70a183-ee4f-41e0-9b98-1f921165cecb" containerName="nova-manage" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800163 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-api" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800177 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-notification-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800185 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="707677b6-625c-4504-bc87-efef3e08b410" containerName="nova-api-log" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800197 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="sg-core" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800208 
4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" containerName="dnsmasq-dns" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.800220 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" containerName="ceilometer-central-agent" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.802354 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.806587 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.806664 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.806873 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.810604 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.817356 4773 scope.go:117] "RemoveContainer" containerID="22d15257331d33f255d9a137c3424a8e49e24dc9327ff48a28741f525b39bd8e" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.824506 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-fqhnk"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.844958 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.846893 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.850454 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.850562 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.850817 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.858884 4773 scope.go:117] "RemoveContainer" containerID="a2ff71cd26cf990c3b205f8a5497b2064ab4cfb6a384421358e6a8bedd4522fb" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.861129 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.892567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.892927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.893112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 
crc kubenswrapper[4773]: I0121 15:50:52.893305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.893514 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.893618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.893894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8x2\" (UniqueName: \"kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.893989 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.899010 4773 scope.go:117] "RemoveContainer" 
containerID="03690e62deccdc8e18079af67baa9cb59f69c484e1146e09eea801f207b7c6c7" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.905060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.922555 4773 scope.go:117] "RemoveContainer" containerID="e467aab9eb9f012dc0c49a73e1deb2c634e2e38ffe3f84433aa6cb84228eefcc" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.952383 4773 scope.go:117] "RemoveContainer" containerID="5e804b1934bf64fffb517df2a3db9feb381dada3624af6f6d25e242824094239" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996040 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-config-data\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj8x2\" (UniqueName: \"kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") 
" pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996617 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead6527b-43a9-4f30-a682-b5e5bd25207e-logs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996754 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqgm\" (UniqueName: \"kubernetes.io/projected/ead6527b-43a9-4f30-a682-b5e5bd25207e-kube-api-access-zvqgm\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996952 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.997068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.997177 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.997818 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.997995 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.998178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.996994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.998322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " 
pod="openstack/ceilometer-0" Jan 21 15:50:52 crc kubenswrapper[4773]: I0121 15:50:52.998827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.001435 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.001454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.001606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.002180 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.008205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.015300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj8x2\" (UniqueName: \"kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2\") pod \"ceilometer-0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") " pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.100708 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-config-data\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.100831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead6527b-43a9-4f30-a682-b5e5bd25207e-logs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.100864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqgm\" (UniqueName: \"kubernetes.io/projected/ead6527b-43a9-4f30-a682-b5e5bd25207e-kube-api-access-zvqgm\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.100892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 
15:50:53.100942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.101043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.101312 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead6527b-43a9-4f30-a682-b5e5bd25207e-logs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.105686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.112882 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.113001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-config-data\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " 
pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.113513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead6527b-43a9-4f30-a682-b5e5bd25207e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.126360 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqgm\" (UniqueName: \"kubernetes.io/projected/ead6527b-43a9-4f30-a682-b5e5bd25207e-kube-api-access-zvqgm\") pod \"nova-api-0\" (UID: \"ead6527b-43a9-4f30-a682-b5e5bd25207e\") " pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.139767 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.173430 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.417219 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707677b6-625c-4504-bc87-efef3e08b410" path="/var/lib/kubelet/pods/707677b6-625c-4504-bc87-efef3e08b410/volumes" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.420684 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ed7666-e174-4d86-931c-4c04712d5a26" path="/var/lib/kubelet/pods/99ed7666-e174-4d86-931c-4c04712d5a26/volumes" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.422510 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5be80f2-5a0f-41c0-b57d-09011f1785c0" path="/var/lib/kubelet/pods/b5be80f2-5a0f-41c0-b57d-09011f1785c0/volumes" Jan 21 15:50:53 crc kubenswrapper[4773]: E0121 15:50:53.503148 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:50:53 crc kubenswrapper[4773]: E0121 15:50:53.504527 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:50:53 crc kubenswrapper[4773]: E0121 15:50:53.506578 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 15:50:53 crc 
kubenswrapper[4773]: E0121 15:50:53.506648 4773 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" containerName="nova-scheduler-scheduler" Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.654708 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 15:50:53 crc kubenswrapper[4773]: W0121 15:50:53.656520 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podead6527b_43a9_4f30_a682_b5e5bd25207e.slice/crio-fde0d6865d2e067e4d7ac4a0c27dd04d389e3820f4c2a8b4d6005d4a1b378d30 WatchSource:0}: Error finding container fde0d6865d2e067e4d7ac4a0c27dd04d389e3820f4c2a8b4d6005d4a1b378d30: Status 404 returned error can't find the container with id fde0d6865d2e067e4d7ac4a0c27dd04d389e3820f4c2a8b4d6005d4a1b378d30 Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.694997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ead6527b-43a9-4f30-a682-b5e5bd25207e","Type":"ContainerStarted","Data":"fde0d6865d2e067e4d7ac4a0c27dd04d389e3820f4c2a8b4d6005d4a1b378d30"} Jan 21 15:50:53 crc kubenswrapper[4773]: I0121 15:50:53.791060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:50:53 crc kubenswrapper[4773]: W0121 15:50:53.792903 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ca5442_5ec5_41ba_807a_d1504e326ef0.slice/crio-16ef9f845d21c292f1e42a5acf70a427214beff2603c008d46a79b0ccb9f13e6 WatchSource:0}: Error finding container 16ef9f845d21c292f1e42a5acf70a427214beff2603c008d46a79b0ccb9f13e6: Status 404 returned error can't find the container with id 
16ef9f845d21c292f1e42a5acf70a427214beff2603c008d46a79b0ccb9f13e6 Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.623963 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.636377 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.727805 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ead6527b-43a9-4f30-a682-b5e5bd25207e","Type":"ContainerStarted","Data":"33713063812bf17aef245d54fdbe72c5793686c22c909cb244cb4185752b1131"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.727844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ead6527b-43a9-4f30-a682-b5e5bd25207e","Type":"ContainerStarted","Data":"d25a96155c37a2e1ef9a1ddd72112af44d3dcfe194e40b25caf70cef241c3386"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.730609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerStarted","Data":"16ef9f845d21c292f1e42a5acf70a427214beff2603c008d46a79b0ccb9f13e6"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.733202 4773 generic.go:334] "Generic (PLEG): container finished" podID="986374f1-375b-4346-bde4-7db28c6f1f4e" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" exitCode=0 Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.733249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"986374f1-375b-4346-bde4-7db28c6f1f4e","Type":"ContainerDied","Data":"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.733271 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"986374f1-375b-4346-bde4-7db28c6f1f4e","Type":"ContainerDied","Data":"82ffcd42a446afc4c1f0c889a0cd3188c129683d0509b8ca946f9a2d197bdd3a"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.733287 4773 scope.go:117] "RemoveContainer" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.733375 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle\") pod \"986374f1-375b-4346-bde4-7db28c6f1f4e\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738596 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs\") pod \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738638 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data\") pod \"986374f1-375b-4346-bde4-7db28c6f1f4e\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs\") pod \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 
15:50:54.738752 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data\") pod \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738913 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle\") pod \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.738950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g957t\" (UniqueName: \"kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t\") pod \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\" (UID: \"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.739001 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmxz6\" (UniqueName: \"kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6\") pod \"986374f1-375b-4346-bde4-7db28c6f1f4e\" (UID: \"986374f1-375b-4346-bde4-7db28c6f1f4e\") " Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.740505 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs" (OuterVolumeSpecName: "logs") pod "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" (UID: "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.756916 4773 generic.go:334] "Generic (PLEG): container finished" podID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerID="ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0" exitCode=0 Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.756959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerDied","Data":"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.756985 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3fd0f7b-62b6-4686-8ae0-91d3abd557e8","Type":"ContainerDied","Data":"83c1702257a63f2cb3116c3ece74b722d357d0745472a2b73558185b7831f696"} Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.757045 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.757731 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t" (OuterVolumeSpecName: "kube-api-access-g957t") pod "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" (UID: "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8"). InnerVolumeSpecName "kube-api-access-g957t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.766102 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.76607755 podStartE2EDuration="2.76607755s" podCreationTimestamp="2026-01-21 15:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:54.747095112 +0000 UTC m=+1619.671584724" watchObservedRunningTime="2026-01-21 15:50:54.76607755 +0000 UTC m=+1619.690567172" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.774471 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6" (OuterVolumeSpecName: "kube-api-access-wmxz6") pod "986374f1-375b-4346-bde4-7db28c6f1f4e" (UID: "986374f1-375b-4346-bde4-7db28c6f1f4e"). InnerVolumeSpecName "kube-api-access-wmxz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.786730 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data" (OuterVolumeSpecName: "config-data") pod "986374f1-375b-4346-bde4-7db28c6f1f4e" (UID: "986374f1-375b-4346-bde4-7db28c6f1f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.792486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" (UID: "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.798340 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data" (OuterVolumeSpecName: "config-data") pod "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" (UID: "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.803405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "986374f1-375b-4346-bde4-7db28c6f1f4e" (UID: "986374f1-375b-4346-bde4-7db28c6f1f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.815779 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" (UID: "c3fd0f7b-62b6-4686-8ae0-91d3abd557e8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.830112 4773 scope.go:117] "RemoveContainer" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" Jan 21 15:50:54 crc kubenswrapper[4773]: E0121 15:50:54.830610 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f\": container with ID starting with 5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f not found: ID does not exist" containerID="5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.830664 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f"} err="failed to get container status \"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f\": rpc error: code = NotFound desc = could not find container \"5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f\": container with ID starting with 5044d6fe4fd111daa402d4ce73d3180d30cf1c0a7b58b1090cb048805ccb0c0f not found: ID does not exist" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.830711 4773 scope.go:117] "RemoveContainer" containerID="ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.841944 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.841978 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g957t\" (UniqueName: \"kubernetes.io/projected/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-kube-api-access-g957t\") on node 
\"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.841989 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmxz6\" (UniqueName: \"kubernetes.io/projected/986374f1-375b-4346-bde4-7db28c6f1f4e-kube-api-access-wmxz6\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.841999 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.842011 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-logs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.842051 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/986374f1-375b-4346-bde4-7db28c6f1f4e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.842061 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.842071 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.855895 4773 scope.go:117] "RemoveContainer" containerID="d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.879525 4773 scope.go:117] "RemoveContainer" containerID="ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0" Jan 21 
15:50:54 crc kubenswrapper[4773]: E0121 15:50:54.879962 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0\": container with ID starting with ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0 not found: ID does not exist" containerID="ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.880005 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0"} err="failed to get container status \"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0\": rpc error: code = NotFound desc = could not find container \"ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0\": container with ID starting with ffbeb35c02bf016d2b448fa1b7776ee259f065e911646afac172a396d562a1f0 not found: ID does not exist" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.880055 4773 scope.go:117] "RemoveContainer" containerID="d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e" Jan 21 15:50:54 crc kubenswrapper[4773]: E0121 15:50:54.880368 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e\": container with ID starting with d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e not found: ID does not exist" containerID="d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e" Jan 21 15:50:54 crc kubenswrapper[4773]: I0121 15:50:54.880398 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e"} err="failed to get container status 
\"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e\": rpc error: code = NotFound desc = could not find container \"d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e\": container with ID starting with d5c0858bab8b3084988ae406cb66bad9016fb11b8898790dacc616959f348c9e not found: ID does not exist" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.094129 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.107329 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.140758 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: E0121 15:50:55.141312 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-metadata" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.141387 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-metadata" Jan 21 15:50:55 crc kubenswrapper[4773]: E0121 15:50:55.141451 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-log" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.141498 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-log" Jan 21 15:50:55 crc kubenswrapper[4773]: E0121 15:50:55.141561 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" containerName="nova-scheduler-scheduler" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.141631 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" 
containerName="nova-scheduler-scheduler" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.141890 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" containerName="nova-scheduler-scheduler" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.141966 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-log" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.148465 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" containerName="nova-metadata-metadata" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.149570 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.154755 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.183935 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.208684 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.211839 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.211889 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.211931 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.243498 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.243648 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" gracePeriod=600 Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.252658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z486g\" (UniqueName: \"kubernetes.io/projected/9dbed76e-56cf-4a05-b29a-0e2bc8454441-kube-api-access-z486g\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.252932 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-config-data\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.253008 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.257915 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: E0121 15:50:55.284600 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod986374f1_375b_4346_bde4_7db28c6f1f4e.slice\": RecentStats: unable to find data in memory cache]" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.287853 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.289789 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.291552 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.292010 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.297721 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-config-data\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355617 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z486g\" (UniqueName: \"kubernetes.io/projected/9dbed76e-56cf-4a05-b29a-0e2bc8454441-kube-api-access-z486g\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355654 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfwp\" (UniqueName: \"kubernetes.io/projected/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-kube-api-access-8dfwp\") pod 
\"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355716 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-config-data\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355737 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-logs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.355862 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.361214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 
15:50:55.361908 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.371432 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dbed76e-56cf-4a05-b29a-0e2bc8454441-config-data\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.376439 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z486g\" (UniqueName: \"kubernetes.io/projected/9dbed76e-56cf-4a05-b29a-0e2bc8454441-kube-api-access-z486g\") pod \"nova-scheduler-0\" (UID: \"9dbed76e-56cf-4a05-b29a-0e2bc8454441\") " pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.399673 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986374f1-375b-4346-bde4-7db28c6f1f4e" path="/var/lib/kubelet/pods/986374f1-375b-4346-bde4-7db28c6f1f4e/volumes" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.400355 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fd0f7b-62b6-4686-8ae0-91d3abd557e8" path="/var/lib/kubelet/pods/c3fd0f7b-62b6-4686-8ae0-91d3abd557e8/volumes" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457165 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-logs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457582 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-config-data\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457677 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfwp\" (UniqueName: \"kubernetes.io/projected/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-kube-api-access-8dfwp\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.457717 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-logs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.459956 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.460189 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.461294 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.472447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.472572 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-config-data\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.475443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfwp\" (UniqueName: \"kubernetes.io/projected/7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a-kube-api-access-8dfwp\") pod \"nova-metadata-0\" (UID: \"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a\") " pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.477512 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.613486 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 15:50:55 crc kubenswrapper[4773]: E0121 15:50:55.880820 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:50:55 crc kubenswrapper[4773]: I0121 15:50:55.921938 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.081868 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 15:50:56 crc kubenswrapper[4773]: W0121 15:50:56.082618 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fa4e0c1_4ca6_40a8_8eb4_6d8a5193762a.slice/crio-a39653f282b44085c29804ca09ec1316497e5ee647c16212c222303d6f9692f2 WatchSource:0}: Error finding container a39653f282b44085c29804ca09ec1316497e5ee647c16212c222303d6f9692f2: Status 404 returned error can't find the container with id a39653f282b44085c29804ca09ec1316497e5ee647c16212c222303d6f9692f2 Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.785488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerStarted","Data":"33db7d189ab7a6f185952a70dc00cead16d4d16f8be4167a5f2e2bb3ea375b36"} Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.788179 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" exitCode=0 Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 
15:50:56.788235 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"} Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.788258 4773 scope.go:117] "RemoveContainer" containerID="056b8b391d1fca843084a7f1dcee0b88446478caa5b2f33055adc27b73ac99d3" Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.789184 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.790976 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a","Type":"ContainerStarted","Data":"6afbd0c68acf0cc7412aab95a7933cafb2ca905360399b6ccb9e5c0769fc7864"} Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.791584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a","Type":"ContainerStarted","Data":"a39653f282b44085c29804ca09ec1316497e5ee647c16212c222303d6f9692f2"} Jan 21 15:50:56 crc kubenswrapper[4773]: E0121 15:50:56.798722 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.802074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"9dbed76e-56cf-4a05-b29a-0e2bc8454441","Type":"ContainerStarted","Data":"a2bc1d055e5104c72c5a331e30f0c1cecb724383c7d9198e1c812be552b2c883"} Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.802393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9dbed76e-56cf-4a05-b29a-0e2bc8454441","Type":"ContainerStarted","Data":"234cf14d2b5a2fdc3bd98180c057998f9a80ebaf3e5411555ac18a131e6bab6f"} Jan 21 15:50:56 crc kubenswrapper[4773]: I0121 15:50:56.834160 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.8341389540000002 podStartE2EDuration="1.834138954s" podCreationTimestamp="2026-01-21 15:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:56.830239378 +0000 UTC m=+1621.754729000" watchObservedRunningTime="2026-01-21 15:50:56.834138954 +0000 UTC m=+1621.758628576" Jan 21 15:50:57 crc kubenswrapper[4773]: I0121 15:50:57.835792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerStarted","Data":"233a5e489e27703c8c2c5992f35c4b29f5396787148826fb9b9762bbe4ce259f"} Jan 21 15:50:57 crc kubenswrapper[4773]: I0121 15:50:57.842747 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a","Type":"ContainerStarted","Data":"8ce1598da480dbd2d4bb02a10fa0af7ce74ece64af498e93f4b9a9cc05fc10a6"} Jan 21 15:50:57 crc kubenswrapper[4773]: I0121 15:50:57.868743 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.868721637 podStartE2EDuration="2.868721637s" podCreationTimestamp="2026-01-21 15:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:50:57.860859742 +0000 UTC m=+1622.785349374" watchObservedRunningTime="2026-01-21 15:50:57.868721637 +0000 UTC m=+1622.793211259"
Jan 21 15:50:58 crc kubenswrapper[4773]: I0121 15:50:58.856383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerStarted","Data":"21464dc91e6188d401e22868386e98d507ac661bcc82141ec3a76b8aaf6ae567"}
Jan 21 15:51:00 crc kubenswrapper[4773]: I0121 15:51:00.478035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 21 15:51:00 crc kubenswrapper[4773]: I0121 15:51:00.613603 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 21 15:51:00 crc kubenswrapper[4773]: I0121 15:51:00.614514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 21 15:51:00 crc kubenswrapper[4773]: I0121 15:51:00.879818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerStarted","Data":"a70a9fc885aba7b23581e22532d919826cce4706cc27d1942c67911b4a9178ef"}
Jan 21 15:51:00 crc kubenswrapper[4773]: I0121 15:51:00.880421 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 15:51:03 crc kubenswrapper[4773]: I0121 15:51:03.174755 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 21 15:51:03 crc kubenswrapper[4773]: I0121 15:51:03.176654 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 21 15:51:04 crc kubenswrapper[4773]: I0121 15:51:04.221981 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ead6527b-43a9-4f30-a682-b5e5bd25207e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:51:04 crc kubenswrapper[4773]: I0121 15:51:04.222062 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ead6527b-43a9-4f30-a682-b5e5bd25207e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.477781 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.513022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.560556 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.767864804 podStartE2EDuration="13.560519639s" podCreationTimestamp="2026-01-21 15:50:52 +0000 UTC" firstStartedPulling="2026-01-21 15:50:53.795419102 +0000 UTC m=+1618.719908724" lastFinishedPulling="2026-01-21 15:51:00.588073937 +0000 UTC m=+1625.512563559" observedRunningTime="2026-01-21 15:51:00.907319555 +0000 UTC m=+1625.831809177" watchObservedRunningTime="2026-01-21 15:51:05.560519639 +0000 UTC m=+1630.485009271"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.614573 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.615176 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 21 15:51:05 crc kubenswrapper[4773]: I0121 15:51:05.975405 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 21 15:51:06 crc kubenswrapper[4773]: I0121 15:51:06.664970 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.237:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:51:06 crc kubenswrapper[4773]: I0121 15:51:06.664970 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.237:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 15:51:10 crc kubenswrapper[4773]: I0121 15:51:10.383893 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:51:10 crc kubenswrapper[4773]: E0121 15:51:10.384839 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 15:51:13 crc kubenswrapper[4773]: I0121 15:51:13.181048 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 21 15:51:13 crc kubenswrapper[4773]: I0121 15:51:13.181667 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 21 15:51:13 crc kubenswrapper[4773]: I0121 15:51:13.188721 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 21 15:51:13 crc kubenswrapper[4773]: I0121 15:51:13.188782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 21 15:51:14 crc kubenswrapper[4773]: I0121 15:51:14.022060 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 21 15:51:14 crc kubenswrapper[4773]: I0121 15:51:14.029558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 21 15:51:15 crc kubenswrapper[4773]: I0121 15:51:15.619589 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 21 15:51:15 crc kubenswrapper[4773]: I0121 15:51:15.620001 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 21 15:51:15 crc kubenswrapper[4773]: I0121 15:51:15.624644 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 21 15:51:15 crc kubenswrapper[4773]: I0121 15:51:15.630128 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 21 15:51:23 crc kubenswrapper[4773]: I0121 15:51:23.154069 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 21 15:51:25 crc kubenswrapper[4773]: I0121 15:51:25.414716 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:51:25 crc kubenswrapper[4773]: E0121 15:51:25.416109 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.820309 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-77kxr"]
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.830181 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-77kxr"]
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.917522 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-j7nrq"]
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.919074 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.922243 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 21 15:51:34 crc kubenswrapper[4773]: I0121 15:51:34.928825 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-j7nrq"]
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.072249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.072325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.072406 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.072444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.072495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jzr\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.174322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.175324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.175458 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.175524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.175592 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jzr\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.181527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.181584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.183533 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.188226 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.193596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jzr\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr\") pod \"cloudkitty-db-sync-j7nrq\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") " pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.306979 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.405612 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6a84b1-1846-4dd0-be7f-47a8872227ff" path="/var/lib/kubelet/pods/de6a84b1-1846-4dd0-be7f-47a8872227ff/volumes"
Jan 21 15:51:35 crc kubenswrapper[4773]: I0121 15:51:35.856441 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-j7nrq"]
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.249172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-j7nrq" event={"ID":"e6481770-7fe3-45bf-8e7b-18ca325f1a6d","Type":"ContainerStarted","Data":"7120214c083d27e69c5b06783d00812db4d1905e4bce5f5595ec8d9e28a91242"}
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.713451 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.713998 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-central-agent" containerID="cri-o://33db7d189ab7a6f185952a70dc00cead16d4d16f8be4167a5f2e2bb3ea375b36" gracePeriod=30
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.714053 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="sg-core" containerID="cri-o://21464dc91e6188d401e22868386e98d507ac661bcc82141ec3a76b8aaf6ae567" gracePeriod=30
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.714074 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-notification-agent" containerID="cri-o://233a5e489e27703c8c2c5992f35c4b29f5396787148826fb9b9762bbe4ce259f" gracePeriod=30
Jan 21 15:51:36 crc kubenswrapper[4773]: I0121 15:51:36.714093 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="proxy-httpd" containerID="cri-o://a70a9fc885aba7b23581e22532d919826cce4706cc27d1942c67911b4a9178ef" gracePeriod=30
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.125384 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.262902 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerID="a70a9fc885aba7b23581e22532d919826cce4706cc27d1942c67911b4a9178ef" exitCode=0
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.262940 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerID="21464dc91e6188d401e22868386e98d507ac661bcc82141ec3a76b8aaf6ae567" exitCode=2
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.262978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerDied","Data":"a70a9fc885aba7b23581e22532d919826cce4706cc27d1942c67911b4a9178ef"}
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.263021 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerDied","Data":"21464dc91e6188d401e22868386e98d507ac661bcc82141ec3a76b8aaf6ae567"}
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.264645 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-j7nrq" event={"ID":"e6481770-7fe3-45bf-8e7b-18ca325f1a6d","Type":"ContainerStarted","Data":"6b87c48c14a097a1f64db9734b0f9800d89972e9e28e2c15b6b141c87c13262a"}
Jan 21 15:51:37 crc kubenswrapper[4773]: I0121 15:51:37.287011 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-j7nrq" podStartSLOduration=2.695281669 podStartE2EDuration="3.28698573s" podCreationTimestamp="2026-01-21 15:51:34 +0000 UTC" firstStartedPulling="2026-01-21 15:51:35.863062577 +0000 UTC m=+1660.787552199" lastFinishedPulling="2026-01-21 15:51:36.454766638 +0000 UTC m=+1661.379256260" observedRunningTime="2026-01-21 15:51:37.286444865 +0000 UTC m=+1662.210934507" watchObservedRunningTime="2026-01-21 15:51:37.28698573 +0000 UTC m=+1662.211475352"
Jan 21 15:51:38 crc kubenswrapper[4773]: I0121 15:51:38.252967 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 21 15:51:38 crc kubenswrapper[4773]: I0121 15:51:38.280981 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerID="33db7d189ab7a6f185952a70dc00cead16d4d16f8be4167a5f2e2bb3ea375b36" exitCode=0
Jan 21 15:51:38 crc kubenswrapper[4773]: I0121 15:51:38.281062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerDied","Data":"33db7d189ab7a6f185952a70dc00cead16d4d16f8be4167a5f2e2bb3ea375b36"}
Jan 21 15:51:38 crc kubenswrapper[4773]: I0121 15:51:38.384116 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:51:38 crc kubenswrapper[4773]: E0121 15:51:38.384449 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 15:51:39 crc kubenswrapper[4773]: I0121 15:51:39.292834 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6481770-7fe3-45bf-8e7b-18ca325f1a6d" containerID="6b87c48c14a097a1f64db9734b0f9800d89972e9e28e2c15b6b141c87c13262a" exitCode=0
Jan 21 15:51:39 crc kubenswrapper[4773]: I0121 15:51:39.292927 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-j7nrq" event={"ID":"e6481770-7fe3-45bf-8e7b-18ca325f1a6d","Type":"ContainerDied","Data":"6b87c48c14a097a1f64db9734b0f9800d89972e9e28e2c15b6b141c87c13262a"}
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.319214 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerID="233a5e489e27703c8c2c5992f35c4b29f5396787148826fb9b9762bbe4ce259f" exitCode=0
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.319279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerDied","Data":"233a5e489e27703c8c2c5992f35c4b29f5396787148826fb9b9762bbe4ce259f"}
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.785404 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.907234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.907615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj8x2\" (UniqueName: \"kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.907906 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908036 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908351 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908455 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908559 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908734 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts\") pod \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\" (UID: \"a5ca5442-5ec5-41ba-807a-d1504e326ef0\") "
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908810 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.908905 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.909996 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.910114 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5ca5442-5ec5-41ba-807a-d1504e326ef0-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.916208 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2" (OuterVolumeSpecName: "kube-api-access-lj8x2") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "kube-api-access-lj8x2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.921341 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts" (OuterVolumeSpecName: "scripts") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.922943 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:40 crc kubenswrapper[4773]: I0121 15:51:40.942508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.005275 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.018584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs\") pod \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") "
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.018810 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts\") pod \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") "
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.018857 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9jzr\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr\") pod \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") "
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.018978 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data\") pod \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") "
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.019004 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle\") pod \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\" (UID: \"e6481770-7fe3-45bf-8e7b-18ca325f1a6d\") "
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.019464 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.019480 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.019489 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj8x2\" (UniqueName: \"kubernetes.io/projected/a5ca5442-5ec5-41ba-807a-d1504e326ef0-kube-api-access-lj8x2\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.019500 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.025954 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs" (OuterVolumeSpecName: "certs") pod "e6481770-7fe3-45bf-8e7b-18ca325f1a6d" (UID: "e6481770-7fe3-45bf-8e7b-18ca325f1a6d"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.030260 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts" (OuterVolumeSpecName: "scripts") pod "e6481770-7fe3-45bf-8e7b-18ca325f1a6d" (UID: "e6481770-7fe3-45bf-8e7b-18ca325f1a6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.035906 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr" (OuterVolumeSpecName: "kube-api-access-r9jzr") pod "e6481770-7fe3-45bf-8e7b-18ca325f1a6d" (UID: "e6481770-7fe3-45bf-8e7b-18ca325f1a6d"). InnerVolumeSpecName "kube-api-access-r9jzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.036213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.054943 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data" (OuterVolumeSpecName: "config-data") pod "a5ca5442-5ec5-41ba-807a-d1504e326ef0" (UID: "a5ca5442-5ec5-41ba-807a-d1504e326ef0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.057824 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6481770-7fe3-45bf-8e7b-18ca325f1a6d" (UID: "e6481770-7fe3-45bf-8e7b-18ca325f1a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.063936 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data" (OuterVolumeSpecName: "config-data") pod "e6481770-7fe3-45bf-8e7b-18ca325f1a6d" (UID: "e6481770-7fe3-45bf-8e7b-18ca325f1a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121419 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121473 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9jzr\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-kube-api-access-r9jzr\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121492 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121505 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121517 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121529 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ca5442-5ec5-41ba-807a-d1504e326ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.121543 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6481770-7fe3-45bf-8e7b-18ca325f1a6d-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.333031 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5ca5442-5ec5-41ba-807a-d1504e326ef0","Type":"ContainerDied","Data":"16ef9f845d21c292f1e42a5acf70a427214beff2603c008d46a79b0ccb9f13e6"}
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.333071 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.333082 4773 scope.go:117] "RemoveContainer" containerID="a70a9fc885aba7b23581e22532d919826cce4706cc27d1942c67911b4a9178ef"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.337606 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-j7nrq" event={"ID":"e6481770-7fe3-45bf-8e7b-18ca325f1a6d","Type":"ContainerDied","Data":"7120214c083d27e69c5b06783d00812db4d1905e4bce5f5595ec8d9e28a91242"}
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.337652 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7120214c083d27e69c5b06783d00812db4d1905e4bce5f5595ec8d9e28a91242"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.337675 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-j7nrq"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.367810 4773 scope.go:117] "RemoveContainer" containerID="21464dc91e6188d401e22868386e98d507ac661bcc82141ec3a76b8aaf6ae567"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.383004 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.409793 4773 scope.go:117] "RemoveContainer" containerID="233a5e489e27703c8c2c5992f35c4b29f5396787148826fb9b9762bbe4ce259f"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.414951 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.432578 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 21 15:51:41 crc kubenswrapper[4773]: E0121 15:51:41.433025 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="sg-core"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433045 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="sg-core"
Jan 21 15:51:41 crc kubenswrapper[4773]: E0121 15:51:41.433069 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-notification-agent"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433078 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-notification-agent"
Jan 21 15:51:41 crc kubenswrapper[4773]: E0121 15:51:41.433094 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6481770-7fe3-45bf-8e7b-18ca325f1a6d" containerName="cloudkitty-db-sync"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433103 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6481770-7fe3-45bf-8e7b-18ca325f1a6d" containerName="cloudkitty-db-sync"
Jan 21 15:51:41 crc kubenswrapper[4773]: E0121 15:51:41.433123 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="proxy-httpd"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433130 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="proxy-httpd"
Jan 21 15:51:41 crc kubenswrapper[4773]: E0121 15:51:41.433139 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-central-agent"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433145 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-central-agent"
Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433325 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-central-agent"
Jan 21 15:51:41
crc kubenswrapper[4773]: I0121 15:51:41.433340 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6481770-7fe3-45bf-8e7b-18ca325f1a6d" containerName="cloudkitty-db-sync" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433351 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="proxy-httpd" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433366 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="ceilometer-notification-agent" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.433377 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" containerName="sg-core" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.435319 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.438207 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.438329 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.440655 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.457616 4773 scope.go:117] "RemoveContainer" containerID="33db7d189ab7a6f185952a70dc00cead16d4d16f8be4167a5f2e2bb3ea375b36" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.467614 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.487433 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-9pxhf"] Jan 21 15:51:41 crc 
kubenswrapper[4773]: I0121 15:51:41.500507 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-9pxhf"] Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.530880 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.530956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-scripts\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531077 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-log-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531162 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " 
pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531190 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4wx\" (UniqueName: \"kubernetes.io/projected/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-kube-api-access-km4wx\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531278 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-config-data\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.531372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-run-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.559289 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-5vvkf"] Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.561004 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.566169 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.573476 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-5vvkf"] Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634044 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-scripts\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634201 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-log-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4wx\" (UniqueName: \"kubernetes.io/projected/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-kube-api-access-km4wx\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634318 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-config-data\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-run-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.634969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-log-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.635016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-run-httpd\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.641045 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-config-data\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.641155 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.641547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.655036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.655901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-scripts\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") " pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.656058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4wx\" (UniqueName: \"kubernetes.io/projected/61b8d46c-ed1d-4e8c-9d65-4c901fc300e4-kube-api-access-km4wx\") pod \"ceilometer-0\" (UID: \"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4\") 
" pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.736780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.736873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97rv\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.736937 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.736956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.736996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") 
" pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.763676 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.838900 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.838958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.839009 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.839162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.839254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v97rv\" (UniqueName: 
\"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.844530 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.844894 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.846375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.848485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data\") pod \"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.860446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97rv\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv\") pod 
\"cloudkitty-storageinit-5vvkf\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:41 crc kubenswrapper[4773]: I0121 15:51:41.897386 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:42 crc kubenswrapper[4773]: I0121 15:51:42.364275 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 15:51:42 crc kubenswrapper[4773]: I0121 15:51:42.410432 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="rabbitmq" containerID="cri-o://759d30338b996eef460469dd5376f15317f00b658a5853172f67b625104e5cee" gracePeriod=604795 Jan 21 15:51:42 crc kubenswrapper[4773]: W0121 15:51:42.508120 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15296d45_0901_4f95_b397_841dc24e8a08.slice/crio-78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b WatchSource:0}: Error finding container 78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b: Status 404 returned error can't find the container with id 78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b Jan 21 15:51:42 crc kubenswrapper[4773]: I0121 15:51:42.509931 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-5vvkf"] Jan 21 15:51:43 crc kubenswrapper[4773]: I0121 15:51:43.364870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"a89e5b0016560107082a082161a9bd41fc01214ed959ca5bdb99acd0b63cf3ee"} Jan 21 15:51:43 crc kubenswrapper[4773]: I0121 15:51:43.368833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-5vvkf" 
event={"ID":"15296d45-0901-4f95-b397-841dc24e8a08","Type":"ContainerStarted","Data":"78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b"} Jan 21 15:51:43 crc kubenswrapper[4773]: I0121 15:51:43.396316 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ca5442-5ec5-41ba-807a-d1504e326ef0" path="/var/lib/kubelet/pods/a5ca5442-5ec5-41ba-807a-d1504e326ef0/volumes" Jan 21 15:51:43 crc kubenswrapper[4773]: I0121 15:51:43.397947 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd3ea21-489e-4693-8c06-1ec6af224609" path="/var/lib/kubelet/pods/acd3ea21-489e-4693-8c06-1ec6af224609/volumes" Jan 21 15:51:43 crc kubenswrapper[4773]: I0121 15:51:43.663316 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="rabbitmq" containerID="cri-o://b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01" gracePeriod=604795 Jan 21 15:51:44 crc kubenswrapper[4773]: I0121 15:51:44.380369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-5vvkf" event={"ID":"15296d45-0901-4f95-b397-841dc24e8a08","Type":"ContainerStarted","Data":"4c1b9e99d4d1a77f6f4c818545f52e8f4f4a18fde8a2b8c17f5988d7d732244d"} Jan 21 15:51:44 crc kubenswrapper[4773]: I0121 15:51:44.414087 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-5vvkf" podStartSLOduration=3.4140054060000002 podStartE2EDuration="3.414005406s" podCreationTimestamp="2026-01-21 15:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:44.399573083 +0000 UTC m=+1669.324062705" watchObservedRunningTime="2026-01-21 15:51:44.414005406 +0000 UTC m=+1669.338495028" Jan 21 15:51:47 crc kubenswrapper[4773]: I0121 15:51:47.415431 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="15296d45-0901-4f95-b397-841dc24e8a08" containerID="4c1b9e99d4d1a77f6f4c818545f52e8f4f4a18fde8a2b8c17f5988d7d732244d" exitCode=0 Jan 21 15:51:47 crc kubenswrapper[4773]: I0121 15:51:47.415587 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-5vvkf" event={"ID":"15296d45-0901-4f95-b397-841dc24e8a08","Type":"ContainerDied","Data":"4c1b9e99d4d1a77f6f4c818545f52e8f4f4a18fde8a2b8c17f5988d7d732244d"} Jan 21 15:51:48 crc kubenswrapper[4773]: I0121 15:51:48.428749 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"6c7a6fa36cb3f836b086715d46cf7bead180db530e686065163d192999568b0a"} Jan 21 15:51:48 crc kubenswrapper[4773]: I0121 15:51:48.891141 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.002236 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs\") pod \"15296d45-0901-4f95-b397-841dc24e8a08\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.002407 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data\") pod \"15296d45-0901-4f95-b397-841dc24e8a08\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.002493 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts\") pod \"15296d45-0901-4f95-b397-841dc24e8a08\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " Jan 21 15:51:49 crc 
kubenswrapper[4773]: I0121 15:51:49.002628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle\") pod \"15296d45-0901-4f95-b397-841dc24e8a08\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.002648 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v97rv\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv\") pod \"15296d45-0901-4f95-b397-841dc24e8a08\" (UID: \"15296d45-0901-4f95-b397-841dc24e8a08\") " Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.008353 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs" (OuterVolumeSpecName: "certs") pod "15296d45-0901-4f95-b397-841dc24e8a08" (UID: "15296d45-0901-4f95-b397-841dc24e8a08"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.008613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv" (OuterVolumeSpecName: "kube-api-access-v97rv") pod "15296d45-0901-4f95-b397-841dc24e8a08" (UID: "15296d45-0901-4f95-b397-841dc24e8a08"). InnerVolumeSpecName "kube-api-access-v97rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.012025 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts" (OuterVolumeSpecName: "scripts") pod "15296d45-0901-4f95-b397-841dc24e8a08" (UID: "15296d45-0901-4f95-b397-841dc24e8a08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.030893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15296d45-0901-4f95-b397-841dc24e8a08" (UID: "15296d45-0901-4f95-b397-841dc24e8a08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.032681 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data" (OuterVolumeSpecName: "config-data") pod "15296d45-0901-4f95-b397-841dc24e8a08" (UID: "15296d45-0901-4f95-b397-841dc24e8a08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.105483 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.105524 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.105538 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15296d45-0901-4f95-b397-841dc24e8a08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.105549 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v97rv\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-kube-api-access-v97rv\") on node \"crc\" DevicePath \"\"" Jan 21 
15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.105557 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/15296d45-0901-4f95-b397-841dc24e8a08-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.444892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-5vvkf" event={"ID":"15296d45-0901-4f95-b397-841dc24e8a08","Type":"ContainerDied","Data":"78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b"} Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.444973 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-5vvkf" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.445823 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78de6156d66e4928f7cb130bb7625b99a1409fb1c1b35855c7e7ffb3e3e6399b" Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.548837 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.549176 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="6c718710-7612-4e1f-b166-4c031c7051da" containerName="cloudkitty-proc" containerID="cri-o://9eab6517396c2473e4608a5bebc694409a9bf9f73514a98ff5cba3407e2404e7" gracePeriod=30 Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.561366 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:51:49 crc kubenswrapper[4773]: I0121 15:51:49.561646 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api-log" containerID="cri-o://ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6" gracePeriod=30 Jan 21 15:51:49 crc
kubenswrapper[4773]: I0121 15:51:49.561708 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" containerID="cri-o://dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7" gracePeriod=30 Jan 21 15:51:50 crc kubenswrapper[4773]: I0121 15:51:50.959454 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.379683 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.204:8889/healthcheck\": dial tcp 10.217.0.204:8889: connect: connection refused" Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.473949 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerID="ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6" exitCode=143 Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.474049 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerDied","Data":"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6"} Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.476520 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerID="759d30338b996eef460469dd5376f15317f00b658a5853172f67b625104e5cee" exitCode=0 Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.476562 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerDied","Data":"759d30338b996eef460469dd5376f15317f00b658a5853172f67b625104e5cee"} Jan 21 15:51:51 crc kubenswrapper[4773]: I0121 15:51:51.598433 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.372897 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.379647 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.384463 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.384834 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.478454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.478588 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.478615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.485929 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts" (OuterVolumeSpecName: "scripts") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.489890 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerID="dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7" exitCode=0 Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.489966 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerDied","Data":"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7"} Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.490000 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c7bac139-85d2-4d70-b755-22c0e0e8fa92","Type":"ContainerDied","Data":"84151022ca64a50747c5455f63a6f683dca3e43983630576461bc75521bcab90"} Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.492249 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.493192 4773 scope.go:117] "RemoveContainer" containerID="dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.506919 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5b1166d-2f9b-452c-a0b2-e7f21998ff45","Type":"ContainerDied","Data":"4e7bf9a720ae0056995d1bf8e160e926229c553ed6c30a9a3f731154ba8dbbf1"} Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.507022 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.513401 4773 generic.go:334] "Generic (PLEG): container finished" podID="6c718710-7612-4e1f-b166-4c031c7051da" containerID="9eab6517396c2473e4608a5bebc694409a9bf9f73514a98ff5cba3407e2404e7" exitCode=0 Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.513451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6c718710-7612-4e1f-b166-4c031c7051da","Type":"ContainerDied","Data":"9eab6517396c2473e4608a5bebc694409a9bf9f73514a98ff5cba3407e2404e7"} Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.518917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.518977 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc 
kubenswrapper[4773]: I0121 15:51:52.519008 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519055 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519093 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkvd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519203 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519378 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519406 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519422 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 
15:51:52.519472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58c8x\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x\") pod \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\" (UID: \"c7bac139-85d2-4d70-b755-22c0e0e8fa92\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519499 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.519536 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.520502 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.522223 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs" (OuterVolumeSpecName: "logs") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.529097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs" (OuterVolumeSpecName: "certs") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.534133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x" (OuterVolumeSpecName: "kube-api-access-58c8x") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "kube-api-access-58c8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.579959 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.584024 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.587517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.616801 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625246 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625273 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625283 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625294 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7bac139-85d2-4d70-b755-22c0e0e8fa92-logs\") on node 
\"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625301 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625309 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.625317 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58c8x\" (UniqueName: \"kubernetes.io/projected/c7bac139-85d2-4d70-b755-22c0e0e8fa92-kube-api-access-58c8x\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.628205 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data" (OuterVolumeSpecName: "config-data") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.635844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd" (OuterVolumeSpecName: "kube-api-access-djkvd") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "kube-api-access-djkvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.711912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.721001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7bac139-85d2-4d70-b755-22c0e0e8fa92" (UID: "c7bac139-85d2-4d70-b755-22c0e0e8fa92"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.725995 4773 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/mount]: kubernetes.io/csi: failed to open volume data file 
[/var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/vol_data.json]: open /var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\" (UID: \"e5b1166d-2f9b-452c-a0b2-e7f21998ff45\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/vol_data.json]: open /var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes/kubernetes.io~csi/pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c/vol_data.json: no such file or directory" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.726922 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.726951 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.726967 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7bac139-85d2-4d70-b755-22c0e0e8fa92-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 
15:51:52.726980 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkvd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-kube-api-access-djkvd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.740646 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info" (OuterVolumeSpecName: "pod-info") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.740895 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.741219 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf" (OuterVolumeSpecName: "server-conf") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.741784 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data" (OuterVolumeSpecName: "config-data") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.743842 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c" (OuterVolumeSpecName: "persistence") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.744024 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.749100 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.753580 4773 scope.go:117] "RemoveContainer" containerID="ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.779839 4773 scope.go:117] "RemoveContainer" containerID="dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.781940 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7\": container with ID starting with dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7 not found: ID does not exist" containerID="dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.781977 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7"} err="failed to get container status \"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7\": rpc error: code = NotFound desc = could not find container \"dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7\": container with ID starting with dfcf3bf0c89d719ac576671a87279c8906011c9d8d0a3c066d272e6365041df7 not found: ID does not exist" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.781997 4773 scope.go:117] "RemoveContainer" containerID="ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.782328 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6\": container with ID starting with 
ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6 not found: ID does not exist" containerID="ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.782371 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6"} err="failed to get container status \"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6\": rpc error: code = NotFound desc = could not find container \"ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6\": container with ID starting with ce675cf905e8e07b35c000385f7a3f08d11c2d43c6fd05ab046ff8afd02ce1b6 not found: ID does not exist" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.782400 4773 scope.go:117] "RemoveContainer" containerID="759d30338b996eef460469dd5376f15317f00b658a5853172f67b625104e5cee" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.792876 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e5b1166d-2f9b-452c-a0b2-e7f21998ff45" (UID: "e5b1166d-2f9b-452c-a0b2-e7f21998ff45"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829368 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829423 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") on node \"crc\" " Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829438 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829449 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829459 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829469 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.829479 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: 
I0121 15:51:52.829491 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5b1166d-2f9b-452c-a0b2-e7f21998ff45-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.858224 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.871917 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.894493 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.912151 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.917766 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.918283 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="rabbitmq" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918304 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="rabbitmq" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.918321 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918328 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.918350 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="setup-container" Jan 21 15:51:52 crc 
kubenswrapper[4773]: I0121 15:51:52.918357 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="setup-container" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.918370 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api-log" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api-log" Jan 21 15:51:52 crc kubenswrapper[4773]: E0121 15:51:52.918393 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15296d45-0901-4f95-b397-841dc24e8a08" containerName="cloudkitty-storageinit" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918399 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="15296d45-0901-4f95-b397-841dc24e8a08" containerName="cloudkitty-storageinit" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918602 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" containerName="rabbitmq" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918621 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918629 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" containerName="cloudkitty-api-log" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.918637 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="15296d45-0901-4f95-b397-841dc24e8a08" containerName="cloudkitty-storageinit" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.919794 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.922060 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.922286 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.922408 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.927535 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.929363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.933310 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.933581 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.933717 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.933814 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.933924 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.934018 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.934407 4773 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lxb5s" Jan 21 15:51:52 crc kubenswrapper[4773]: I0121 15:51:52.939105 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-config-data\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033805 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c80bcc16-be97-47d8-afb6-7a1378546882-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.033927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034061 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034128 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034480 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c80bcc16-be97-47d8-afb6-7a1378546882-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4l5\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-kube-api-access-hv4l5\") pod 
\"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034561 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b666481-ecf0-4091-8d05-b403082294fe-logs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.034647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpfm6\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-kube-api-access-tpfm6\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.137072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 
21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c80bcc16-be97-47d8-afb6-7a1378546882-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.138907 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4l5\" (UniqueName: 
\"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-kube-api-access-hv4l5\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139127 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139246 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b666481-ecf0-4091-8d05-b403082294fe-logs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpfm6\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-kube-api-access-tpfm6\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-config-data\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " 
pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.140016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c80bcc16-be97-47d8-afb6-7a1378546882-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.140085 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.140183 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.140333 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.140465 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.142299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.142566 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.142669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-config-data\") pod \"cloudkitty-api-0\" (UID: 
\"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.142992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.139922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b666481-ecf0-4091-8d05-b403082294fe-logs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.143472 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-config-data\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.143761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.143853 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.145278 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/c80bcc16-be97-47d8-afb6-7a1378546882-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.145388 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.150518 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.150770 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c80bcc16-be97-47d8-afb6-7a1378546882-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.150924 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c80bcc16-be97-47d8-afb6-7a1378546882-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.151028 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc 
kubenswrapper[4773]: I0121 15:51:53.154232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b666481-ecf0-4091-8d05-b403082294fe-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.155784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.157169 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4l5\" (UniqueName: \"kubernetes.io/projected/c80bcc16-be97-47d8-afb6-7a1378546882-kube-api-access-hv4l5\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.158574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpfm6\" (UniqueName: \"kubernetes.io/projected/7b666481-ecf0-4091-8d05-b403082294fe-kube-api-access-tpfm6\") pod \"cloudkitty-api-0\" (UID: \"7b666481-ecf0-4091-8d05-b403082294fe\") " pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.216399 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.217040 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c") on node "crc" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.222532 4773 scope.go:117] "RemoveContainer" containerID="a2d82cd99a3e89e61c6361ec20503a0529ff9b77bb61e768caaf32d1b0602c8e" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.242117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.247284 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.252111 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.253300 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.253343 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84c1d087d48810d54c9111c49096770b7a04a8eec4ea7f577cb5bfe0d8b6f672/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.317813 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ef1cd73-ac7c-4b39-a5e3-2f86909ff20c\") pod \"rabbitmq-server-0\" (UID: \"c80bcc16-be97-47d8-afb6-7a1378546882\") " pod="openstack/rabbitmq-server-0" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.426221 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7bac139-85d2-4d70-b755-22c0e0e8fa92" path="/var/lib/kubelet/pods/c7bac139-85d2-4d70-b755-22c0e0e8fa92/volumes" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.428101 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b1166d-2f9b-452c-a0b2-e7f21998ff45" path="/var/lib/kubelet/pods/e5b1166d-2f9b-452c-a0b2-e7f21998ff45/volumes" Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.562277 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.568102 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"49aa5bb026cbf527960661bb2b0522bc0e85c78188bb258805b1352c9b0a9992"}
Jan 21 15:51:53 crc kubenswrapper[4773]: I0121 15:51:53.824791 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.005553 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.172528 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.196626 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.196804 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.196965 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.197005 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.197064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.197086 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chpr\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr\") pod \"6c718710-7612-4e1f-b166-4c031c7051da\" (UID: \"6c718710-7612-4e1f-b166-4c031c7051da\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.202914 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts" (OuterVolumeSpecName: "scripts") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.202998 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs" (OuterVolumeSpecName: "certs") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.222663 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.229032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr" (OuterVolumeSpecName: "kube-api-access-6chpr") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "kube-api-access-6chpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.286887 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data" (OuterVolumeSpecName: "config-data") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301168 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-certs\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301198 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301209 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301220 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301233 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6chpr\" (UniqueName: \"kubernetes.io/projected/6c718710-7612-4e1f-b166-4c031c7051da-kube-api-access-6chpr\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.301254 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c718710-7612-4e1f-b166-4c031c7051da" (UID: "6c718710-7612-4e1f-b166-4c031c7051da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.363951 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"]
Jan 21 15:51:54 crc kubenswrapper[4773]: E0121 15:51:54.381687 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c718710-7612-4e1f-b166-4c031c7051da" containerName="cloudkitty-proc"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.381745 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c718710-7612-4e1f-b166-4c031c7051da" containerName="cloudkitty-proc"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.382018 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c718710-7612-4e1f-b166-4c031c7051da" containerName="cloudkitty-proc"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.383406 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.390173 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.392035 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.402604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grtz9\" (UniqueName: \"kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.402972 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.403146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.403266 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.403773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.403877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.404019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.404336 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c718710-7612-4e1f-b166-4c031c7051da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grtz9\" (UniqueName: \"kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505718 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505781 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.505827 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.506746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.509039 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.509125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.509541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.509955 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.511178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.557504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grtz9\" (UniqueName: \"kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9\") pod \"dnsmasq-dns-dbb88bf8c-ll9rg\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") " pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.573883 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.652766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6c718710-7612-4e1f-b166-4c031c7051da","Type":"ContainerDied","Data":"2588bc98d01b1289e925af2999552b1f558973b3fd34b6ffb6b2f4cd81a8fae8"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.652816 4773 scope.go:117] "RemoveContainer" containerID="9eab6517396c2473e4608a5bebc694409a9bf9f73514a98ff5cba3407e2404e7"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.652925 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.661056 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714406 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714546 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714662 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgzbf\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714749 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.714782 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.720829 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.720930 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.721054 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.721090 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data\") pod \"1849053d-528d-42bf-93f3-31cb3ef1c91e\" (UID: \"1849053d-528d-42bf-93f3-31cb3ef1c91e\") "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.724402 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.725141 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.727622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.737506 4773 generic.go:334] "Generic (PLEG): container finished" podID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerID="b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01" exitCode=0
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.737663 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.738247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerDied","Data":"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.738279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1849053d-528d-42bf-93f3-31cb3ef1c91e","Type":"ContainerDied","Data":"e3acd80e2d389ae53d3d78bdec15097675f1406fef4a69507d88c3989e00585e"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.779252 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.783642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.806032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b666481-ecf0-4091-8d05-b403082294fe","Type":"ContainerStarted","Data":"d7067f717a6b51397cd538f143207ecc923f9eaf2cb6fcf0e8507a55730cbc34"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.806087 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b666481-ecf0-4091-8d05-b403082294fe","Type":"ContainerStarted","Data":"9eedd329f176b65d831db5067b894fe2fe4809d59750935be008c78d58bd51d7"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.830731 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1849053d-528d-42bf-93f3-31cb3ef1c91e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.830762 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.830771 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.830779 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.850235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf" (OuterVolumeSpecName: "kube-api-access-vgzbf") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "kube-api-access-vgzbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.855613 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.864042 4773 scope.go:117] "RemoveContainer" containerID="b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.864170 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c80bcc16-be97-47d8-afb6-7a1378546882","Type":"ContainerStarted","Data":"b6ccf5f922c57f2678dadbe1b00fa6560afb791473e06042eafebd6d0917ae86"}
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.871026 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info" (OuterVolumeSpecName: "pod-info") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.886106 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6" (OuterVolumeSpecName: "persistence") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.888144 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.899466 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: E0121 15:51:54.900030 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="setup-container"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.900050 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="setup-container"
Jan 21 15:51:54 crc kubenswrapper[4773]: E0121 15:51:54.900063 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="rabbitmq"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.900068 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="rabbitmq"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.900272 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" containerName="rabbitmq"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.901163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.907176 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.917821 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.932767 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.932826 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgzbf\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-kube-api-access-vgzbf\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.932847 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1849053d-528d-42bf-93f3-31cb3ef1c91e-pod-info\") on node \"crc\" DevicePath \"\""
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.932887 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") on node \"crc\" "
Jan 21 15:51:54 crc kubenswrapper[4773]: I0121 15:51:54.941684 4773 scope.go:117] "RemoveContainer" containerID="67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035292 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035617 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzjs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-kube-api-access-vxzjs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035797 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-certs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035829 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-scripts\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035848 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.035900 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.047739 4773 scope.go:117] "RemoveContainer" containerID="b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"
Jan 21 15:51:55 crc kubenswrapper[4773]: E0121 15:51:55.049254 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01\": container with ID starting with b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01 not found: ID does not exist" containerID="b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.049290 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01"} err="failed to get container status \"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01\": rpc error: code = NotFound desc = could not find container \"b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01\": container with ID starting with b9cba30ddbac71a1fd968fb13eb51c09d1e6575dbb61a25426f3f4f6104f9e01 not found: ID does not exist"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.049317 4773 scope.go:117] "RemoveContainer" containerID="67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"
Jan 21 15:51:55 crc kubenswrapper[4773]: E0121 15:51:55.054127 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469\": container with ID starting with 67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469 not found: ID does not exist" containerID="67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.054171 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469"} err="failed to get container status \"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469\": rpc error: code = NotFound desc = could not find container \"67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469\": container with ID starting with 67a3d49b2a44624f8c57c27b00161a5673f1606ed0b92a042edd2085c268a469 not found: ID does not exist"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-certs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-scripts\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137564 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxzjs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-kube-api-access-vxzjs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.137706 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.146359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-scripts\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.146675 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.146687 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0"
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.146835 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-config-data-custom\") pod \"cloudkitty-proc-0\"
(UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.147744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data" (OuterVolumeSpecName: "config-data") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.148099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-certs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.170046 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxzjs\" (UniqueName: \"kubernetes.io/projected/aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109-kube-api-access-vxzjs\") pod \"cloudkitty-proc-0\" (UID: \"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109\") " pod="openstack/cloudkitty-proc-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.224762 4773 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.224996 4773 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6") on node "crc" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.239389 4773 reconciler_common.go:293] "Volume detached for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.239442 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.243557 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.290979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf" (OuterVolumeSpecName: "server-conf") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.340952 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1849053d-528d-42bf-93f3-31cb3ef1c91e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.446303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1849053d-528d-42bf-93f3-31cb3ef1c91e" (UID: "1849053d-528d-42bf-93f3-31cb3ef1c91e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.474819 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1849053d-528d-42bf-93f3-31cb3ef1c91e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.490324 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c718710-7612-4e1f-b166-4c031c7051da" path="/var/lib/kubelet/pods/6c718710-7612-4e1f-b166-4c031c7051da/volumes" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.494577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.796623 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.818704 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.833871 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.849732 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.852273 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.855529 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.855573 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.855825 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.855837 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.856056 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f7cnv" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.858168 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.859053 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.892372 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.906099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"edd59d064ef8c053e3998920a39c1c26c8c99b931b52a1ae8b381609f333eaac"} Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.908264 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" event={"ID":"a95e0438-f08e-4f11-854a-cd5d64f30408","Type":"ContainerStarted","Data":"ebfa282abc95aeb271385ee7a2c02eee5043def765976f6f09045755674232ff"} Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.914291 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b666481-ecf0-4091-8d05-b403082294fe","Type":"ContainerStarted","Data":"db6e173d42b262e27569b6b709dece190c258d36bbce08e56269ca4512db1ed6"} Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.914585 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.919167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109","Type":"ContainerStarted","Data":"60f895ee2a2c8f47441b23b4a2d21144f62a384b649c80408d3db058d5230d41"} Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.942158 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.942139251 podStartE2EDuration="3.942139251s" podCreationTimestamp="2026-01-21 15:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:55.939255951 +0000 UTC m=+1680.863745593" watchObservedRunningTime="2026-01-21 15:51:55.942139251 +0000 UTC m=+1680.866628863" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990394 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc 
kubenswrapper[4773]: I0121 15:51:55.990464 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46pk2\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-kube-api-access-46pk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc 
kubenswrapper[4773]: I0121 15:51:55.990789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990811 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.990988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.991073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:55 crc kubenswrapper[4773]: I0121 15:51:55.991139 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093122 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093166 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 
crc kubenswrapper[4773]: I0121 15:51:56.093448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093579 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46pk2\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-kube-api-access-46pk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093618 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.093660 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.094178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.095967 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.097097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.098383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.101123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.101583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.104643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.105073 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.105119 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/979200276d874b4c9cbd62a4a1053c8813c6c43fc0020f0aeefb9b7d36abaec8/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.106313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.113495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.122816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46pk2\" (UniqueName: \"kubernetes.io/projected/13e55b8e-491d-4d97-a0cf-56433eb4a7f1-kube-api-access-46pk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.153313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a0265fd-6903-44e5-9364-76c4c1b3a3a6\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"13e55b8e-491d-4d97-a0cf-56433eb4a7f1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.177203 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.802414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.930264 4773 generic.go:334] "Generic (PLEG): container finished" podID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerID="4fa644aaf140021cff9932968044c454cc956b55d18d872b62420cae5261da5c" exitCode=0 Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.930337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" event={"ID":"a95e0438-f08e-4f11-854a-cd5d64f30408","Type":"ContainerDied","Data":"4fa644aaf140021cff9932968044c454cc956b55d18d872b62420cae5261da5c"} Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.932518 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c80bcc16-be97-47d8-afb6-7a1378546882","Type":"ContainerStarted","Data":"9889d7352e5247f9a7978e1e95d718db849a0ebb358eb063433db34e38fb1e9d"} Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.935197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13e55b8e-491d-4d97-a0cf-56433eb4a7f1","Type":"ContainerStarted","Data":"4810c15bd565c9c0b3b02e367c4670e73fdb0c6c5a87f874b17931b4d0a0f932"} Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 15:51:56.936930 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109","Type":"ContainerStarted","Data":"6a7455a4d472db5df48b9e5db4d73944c9bb92c9a00916a7cc4117fe79ff6ed8"} Jan 21 15:51:56 crc kubenswrapper[4773]: I0121 
15:51:56.980390 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.733138018 podStartE2EDuration="2.980367732s" podCreationTimestamp="2026-01-21 15:51:54 +0000 UTC" firstStartedPulling="2026-01-21 15:51:55.786520215 +0000 UTC m=+1680.711009837" lastFinishedPulling="2026-01-21 15:51:56.033749929 +0000 UTC m=+1680.958239551" observedRunningTime="2026-01-21 15:51:56.969722181 +0000 UTC m=+1681.894211803" watchObservedRunningTime="2026-01-21 15:51:56.980367732 +0000 UTC m=+1681.904857354" Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.403830 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1849053d-528d-42bf-93f3-31cb3ef1c91e" path="/var/lib/kubelet/pods/1849053d-528d-42bf-93f3-31cb3ef1c91e/volumes" Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.948863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"c3db622c00badf8367f798e824a36cc8557364a99c13c5e275035d303caee65a"} Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.950492 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.954272 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" event={"ID":"a95e0438-f08e-4f11-854a-cd5d64f30408","Type":"ContainerStarted","Data":"5d1fb24672addedf9303e87ec2737eedeb84eea278c968997c9dcbb5d6ac5c08"} Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.954311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" Jan 21 15:51:57 crc kubenswrapper[4773]: I0121 15:51:57.977913 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.040971271 
podStartE2EDuration="16.977895503s" podCreationTimestamp="2026-01-21 15:51:41 +0000 UTC" firstStartedPulling="2026-01-21 15:51:42.38361879 +0000 UTC m=+1667.308108412" lastFinishedPulling="2026-01-21 15:51:57.320543022 +0000 UTC m=+1682.245032644" observedRunningTime="2026-01-21 15:51:57.9748547 +0000 UTC m=+1682.899344322" watchObservedRunningTime="2026-01-21 15:51:57.977895503 +0000 UTC m=+1682.902385125" Jan 21 15:51:58 crc kubenswrapper[4773]: I0121 15:51:58.000865 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" podStartSLOduration=4.00084391 podStartE2EDuration="4.00084391s" podCreationTimestamp="2026-01-21 15:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:51:57.993645483 +0000 UTC m=+1682.918135095" watchObservedRunningTime="2026-01-21 15:51:58.00084391 +0000 UTC m=+1682.925333532" Jan 21 15:51:58 crc kubenswrapper[4773]: I0121 15:51:58.969240 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13e55b8e-491d-4d97-a0cf-56433eb4a7f1","Type":"ContainerStarted","Data":"561322339e80a6a7b73e56cf7579fa96a7a52312a92a8c2631e287fb763265c9"} Jan 21 15:52:04 crc kubenswrapper[4773]: I0121 15:52:04.383771 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:52:04 crc kubenswrapper[4773]: E0121 15:52:04.384550 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:52:04 crc kubenswrapper[4773]: 
I0121 15:52:04.663324 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:52:04 crc kubenswrapper[4773]: I0121 15:52:04.723769 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:52:04 crc kubenswrapper[4773]: I0121 15:52:04.902091 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-sm9b8"]
Jan 21 15:52:04 crc kubenswrapper[4773]: I0121 15:52:04.904439 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:04 crc kubenswrapper[4773]: I0121 15:52:04.927883 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-sm9b8"]
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9w5v\" (UniqueName: \"kubernetes.io/projected/597cc973-99fd-42ab-99a3-1009ad011d10-kube-api-access-m9w5v\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017282 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-svc\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.017867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-config\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.018035 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.032148 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" containerName="dnsmasq-dns" containerID="cri-o://c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a" gracePeriod=10
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.119949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-svc\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120005 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-config\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120213 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.120252 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9w5v\" (UniqueName: \"kubernetes.io/projected/597cc973-99fd-42ab-99a3-1009ad011d10-kube-api-access-m9w5v\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.121662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-svc\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.122441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.123071 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.123705 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.124303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-config\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.124943 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/597cc973-99fd-42ab-99a3-1009ad011d10-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.152507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9w5v\" (UniqueName: \"kubernetes.io/projected/597cc973-99fd-42ab-99a3-1009ad011d10-kube-api-access-m9w5v\") pod \"dnsmasq-dns-85f64749dc-sm9b8\" (UID: \"597cc973-99fd-42ab-99a3-1009ad011d10\") " pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.229415 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.805488 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-sm9b8"]
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.853615 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.936552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6zp\" (UniqueName: \"kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.936905 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.936944 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.937005 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.937062 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.937150 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb\") pod \"83e85c33-f0be-4571-b131-5991f5ae6979\" (UID: \"83e85c33-f0be-4571-b131-5991f5ae6979\") "
Jan 21 15:52:05 crc kubenswrapper[4773]: I0121 15:52:05.957798 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp" (OuterVolumeSpecName: "kube-api-access-rn6zp") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "kube-api-access-rn6zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.027444 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.048299 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6zp\" (UniqueName: \"kubernetes.io/projected/83e85c33-f0be-4571-b131-5991f5ae6979-kube-api-access-rn6zp\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.048337 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.059246 4773 generic.go:334] "Generic (PLEG): container finished" podID="83e85c33-f0be-4571-b131-5991f5ae6979" containerID="c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a" exitCode=0
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.059295 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.059325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" event={"ID":"83e85c33-f0be-4571-b131-5991f5ae6979","Type":"ContainerDied","Data":"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"}
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.059353 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8g2hn" event={"ID":"83e85c33-f0be-4571-b131-5991f5ae6979","Type":"ContainerDied","Data":"39f5031adacc7c1d8df374bdef745970745795a22000c4dbd1521782bea75d0f"}
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.059371 4773 scope.go:117] "RemoveContainer" containerID="c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.061255 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8" event={"ID":"597cc973-99fd-42ab-99a3-1009ad011d10","Type":"ContainerStarted","Data":"4e54a3a247c78a19fb23210a7bea0ea1ed9fdd1539559b54fef3e3a382f32ca8"}
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.062801 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.067654 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.071776 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.078088 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config" (OuterVolumeSpecName: "config") pod "83e85c33-f0be-4571-b131-5991f5ae6979" (UID: "83e85c33-f0be-4571-b131-5991f5ae6979"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.097485 4773 scope.go:117] "RemoveContainer" containerID="b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.125494 4773 scope.go:117] "RemoveContainer" containerID="c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"
Jan 21 15:52:06 crc kubenswrapper[4773]: E0121 15:52:06.125976 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a\": container with ID starting with c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a not found: ID does not exist" containerID="c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.126025 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a"} err="failed to get container status \"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a\": rpc error: code = NotFound desc = could not find container \"c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a\": container with ID starting with c0b8b01c4f438f030722c9f893dd0b8dc4bdabf979afd16d0b91bb79f402d03a not found: ID does not exist"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.126051 4773 scope.go:117] "RemoveContainer" containerID="b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4"
Jan 21 15:52:06 crc kubenswrapper[4773]: E0121 15:52:06.126343 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4\": container with ID starting with b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4 not found: ID does not exist" containerID="b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.126375 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4"} err="failed to get container status \"b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4\": rpc error: code = NotFound desc = could not find container \"b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4\": container with ID starting with b49a8bb86ec7aae6be2d853682786210a618922c394e9afffcfb6b91726c12a4 not found: ID does not exist"
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.150437 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.151023 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.151044 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.151055 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e85c33-f0be-4571-b131-5991f5ae6979-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.398564 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:52:06 crc kubenswrapper[4773]: I0121 15:52:06.409235 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8g2hn"]
Jan 21 15:52:07 crc kubenswrapper[4773]: I0121 15:52:07.078326 4773 generic.go:334] "Generic (PLEG): container finished" podID="597cc973-99fd-42ab-99a3-1009ad011d10" containerID="43229f4f46a681db3861c7dd7cc1ddbef5e4553a4c6b2d0f494cb8f90dc5ad8e" exitCode=0
Jan 21 15:52:07 crc kubenswrapper[4773]: I0121 15:52:07.078623 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8" event={"ID":"597cc973-99fd-42ab-99a3-1009ad011d10","Type":"ContainerDied","Data":"43229f4f46a681db3861c7dd7cc1ddbef5e4553a4c6b2d0f494cb8f90dc5ad8e"}
Jan 21 15:52:07 crc kubenswrapper[4773]: I0121 15:52:07.419092 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" path="/var/lib/kubelet/pods/83e85c33-f0be-4571-b131-5991f5ae6979/volumes"
Jan 21 15:52:08 crc kubenswrapper[4773]: I0121 15:52:08.092769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8" event={"ID":"597cc973-99fd-42ab-99a3-1009ad011d10","Type":"ContainerStarted","Data":"b2d5c8ce3c9c10c639be5eab91cdacdf5211a923872969ab0b2c4dd35c44c24e"}
Jan 21 15:52:08 crc kubenswrapper[4773]: I0121 15:52:08.092995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:08 crc kubenswrapper[4773]: I0121 15:52:08.114342 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8" podStartSLOduration=4.114322693 podStartE2EDuration="4.114322693s" podCreationTimestamp="2026-01-21 15:52:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:08.111381543 +0000 UTC m=+1693.035871165" watchObservedRunningTime="2026-01-21 15:52:08.114322693 +0000 UTC m=+1693.038812315"
Jan 21 15:52:11 crc kubenswrapper[4773]: I0121 15:52:11.772294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.231986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-sm9b8"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.301534 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"]
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.301869 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="dnsmasq-dns" containerID="cri-o://5d1fb24672addedf9303e87ec2737eedeb84eea278c968997c9dcbb5d6ac5c08" gracePeriod=10
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.393040 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:52:15 crc kubenswrapper[4773]: E0121 15:52:15.393338 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.438653 4773 scope.go:117] "RemoveContainer" containerID="100540030afd95c37d49ec2b595b7be762f7d926f3f01ca5f118870dc6d7d9ef"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.598337 4773 scope.go:117] "RemoveContainer" containerID="3f99f3742eccde965c85b8c63b109f6acf3d18e10c52777ce3abb069cdac81f8"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.637310 4773 scope.go:117] "RemoveContainer" containerID="47688de5c00552e762952c1ff7a68d39a9f5a9d91ab0442f7c478ce771a4a42e"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.711215 4773 scope.go:117] "RemoveContainer" containerID="5b8053e13b140d3e520fc9a9279fb3a235af358f3f6b88ff28f0e527dd6af165"
Jan 21 15:52:15 crc kubenswrapper[4773]: I0121 15:52:15.752451 4773 scope.go:117] "RemoveContainer" containerID="a5b365f3f16fc1d1d7ebc437e17f6ecd97b9068825e8729cf40a028822d55544"
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.186406 4773 generic.go:334] "Generic (PLEG): container finished" podID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerID="5d1fb24672addedf9303e87ec2737eedeb84eea278c968997c9dcbb5d6ac5c08" exitCode=0
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.186473 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" event={"ID":"a95e0438-f08e-4f11-854a-cd5d64f30408","Type":"ContainerDied","Data":"5d1fb24672addedf9303e87ec2737eedeb84eea278c968997c9dcbb5d6ac5c08"}
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.574065 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.684814 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grtz9\" (UniqueName: \"kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685009 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685174 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685225 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.685260 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb\") pod \"a95e0438-f08e-4f11-854a-cd5d64f30408\" (UID: \"a95e0438-f08e-4f11-854a-cd5d64f30408\") "
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.696765 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9" (OuterVolumeSpecName: "kube-api-access-grtz9") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "kube-api-access-grtz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.764683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.766527 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.767423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config" (OuterVolumeSpecName: "config") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.772355 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.777481 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789029 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789071 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-config\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789086 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789100 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789111 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grtz9\" (UniqueName: \"kubernetes.io/projected/a95e0438-f08e-4f11-854a-cd5d64f30408-kube-api-access-grtz9\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.789124 4773 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.792539 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a95e0438-f08e-4f11-854a-cd5d64f30408" (UID: "a95e0438-f08e-4f11-854a-cd5d64f30408"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:52:16 crc kubenswrapper[4773]: I0121 15:52:16.891215 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a95e0438-f08e-4f11-854a-cd5d64f30408-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.199004 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg" event={"ID":"a95e0438-f08e-4f11-854a-cd5d64f30408","Type":"ContainerDied","Data":"ebfa282abc95aeb271385ee7a2c02eee5043def765976f6f09045755674232ff"}
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.199057 4773 scope.go:117] "RemoveContainer" containerID="5d1fb24672addedf9303e87ec2737eedeb84eea278c968997c9dcbb5d6ac5c08"
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.199078 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-ll9rg"
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.235295 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"]
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.241121 4773 scope.go:117] "RemoveContainer" containerID="4fa644aaf140021cff9932968044c454cc956b55d18d872b62420cae5261da5c"
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.248241 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-ll9rg"]
Jan 21 15:52:17 crc kubenswrapper[4773]: I0121 15:52:17.407195 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" path="/var/lib/kubelet/pods/a95e0438-f08e-4f11-854a-cd5d64f30408/volumes"
Jan 21 15:52:26 crc kubenswrapper[4773]: I0121 15:52:26.383515 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:52:26 crc kubenswrapper[4773]: E0121 15:52:26.384292 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.468557 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx"]
Jan 21 15:52:27 crc kubenswrapper[4773]: E0121 15:52:27.469265 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="dnsmasq-dns"
Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469278 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="dnsmasq-dns"
Jan 21 15:52:27 crc kubenswrapper[4773]: E0121 15:52:27.469289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" containerName="dnsmasq-dns"
Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469296 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" containerName="dnsmasq-dns"
Jan 21 15:52:27 crc kubenswrapper[4773]: E0121 15:52:27.469336 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="init"
Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469342 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="init"
Jan 21 15:52:27 crc kubenswrapper[4773]: E0121 15:52:27.469350 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e85c33-f0be-4571-b131-5991f5ae6979"
containerName="init" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469357 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" containerName="init" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469538 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95e0438-f08e-4f11-854a-cd5d64f30408" containerName="dnsmasq-dns" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.469553 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e85c33-f0be-4571-b131-5991f5ae6979" containerName="dnsmasq-dns" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.470363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.473851 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.473933 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.473978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.474199 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.488618 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx"] Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.514537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjm9\" (UniqueName: 
\"kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.514685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.514768 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.514953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.616365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.616514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.616583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjm9\" (UniqueName: \"kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.616648 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.632414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc 
kubenswrapper[4773]: I0121 15:52:27.632846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.646484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.656413 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjm9\" (UniqueName: \"kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:27 crc kubenswrapper[4773]: I0121 15:52:27.808223 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:28 crc kubenswrapper[4773]: I0121 15:52:28.740993 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx"] Jan 21 15:52:29 crc kubenswrapper[4773]: I0121 15:52:29.321244 4773 generic.go:334] "Generic (PLEG): container finished" podID="c80bcc16-be97-47d8-afb6-7a1378546882" containerID="9889d7352e5247f9a7978e1e95d718db849a0ebb358eb063433db34e38fb1e9d" exitCode=0 Jan 21 15:52:29 crc kubenswrapper[4773]: I0121 15:52:29.321348 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c80bcc16-be97-47d8-afb6-7a1378546882","Type":"ContainerDied","Data":"9889d7352e5247f9a7978e1e95d718db849a0ebb358eb063433db34e38fb1e9d"} Jan 21 15:52:29 crc kubenswrapper[4773]: I0121 15:52:29.323332 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" event={"ID":"246d4cbd-1a69-4a02-99ab-f716001d4e67","Type":"ContainerStarted","Data":"aa8f09519f373a06d0beaf21b038a5c252018f8c23e01925731711941d83ab46"} Jan 21 15:52:30 crc kubenswrapper[4773]: I0121 15:52:30.356473 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c80bcc16-be97-47d8-afb6-7a1378546882","Type":"ContainerStarted","Data":"c4e5417a44fd58ca07dc0451e901c250a07d7f3a2f41ab9cb4f475961c1ba2e4"} Jan 21 15:52:30 crc kubenswrapper[4773]: I0121 15:52:30.358487 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 15:52:30 crc kubenswrapper[4773]: I0121 15:52:30.403052 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.403030504 podStartE2EDuration="38.403030504s" podCreationTimestamp="2026-01-21 15:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:30.397777951 +0000 UTC m=+1715.322267573" watchObservedRunningTime="2026-01-21 15:52:30.403030504 +0000 UTC m=+1715.327520126" Jan 21 15:52:30 crc kubenswrapper[4773]: I0121 15:52:30.740026 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Jan 21 15:52:31 crc kubenswrapper[4773]: I0121 15:52:31.376023 4773 generic.go:334] "Generic (PLEG): container finished" podID="13e55b8e-491d-4d97-a0cf-56433eb4a7f1" containerID="561322339e80a6a7b73e56cf7579fa96a7a52312a92a8c2631e287fb763265c9" exitCode=0 Jan 21 15:52:31 crc kubenswrapper[4773]: I0121 15:52:31.376101 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13e55b8e-491d-4d97-a0cf-56433eb4a7f1","Type":"ContainerDied","Data":"561322339e80a6a7b73e56cf7579fa96a7a52312a92a8c2631e287fb763265c9"} Jan 21 15:52:32 crc kubenswrapper[4773]: I0121 15:52:32.395388 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13e55b8e-491d-4d97-a0cf-56433eb4a7f1","Type":"ContainerStarted","Data":"2621e3abe4205ff97743481d6219ddfddfa95990efd9f928fc0834f57295e36c"} Jan 21 15:52:32 crc kubenswrapper[4773]: I0121 15:52:32.396809 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:52:32 crc kubenswrapper[4773]: I0121 15:52:32.426556 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.426535443 podStartE2EDuration="37.426535443s" podCreationTimestamp="2026-01-21 15:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:52:32.423255594 +0000 UTC m=+1717.347745216" watchObservedRunningTime="2026-01-21 15:52:32.426535443 +0000 UTC m=+1717.351025065" 
Jan 21 15:52:40 crc kubenswrapper[4773]: I0121 15:52:40.384545 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:52:40 crc kubenswrapper[4773]: E0121 15:52:40.386298 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:52:42 crc kubenswrapper[4773]: I0121 15:52:42.498237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" event={"ID":"246d4cbd-1a69-4a02-99ab-f716001d4e67","Type":"ContainerStarted","Data":"2972419e31d7910cd5098fd47d7178a5acd49e38ad8331cbbe45f516a7a23f18"} Jan 21 15:52:42 crc kubenswrapper[4773]: I0121 15:52:42.532966 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" podStartSLOduration=2.701244769 podStartE2EDuration="15.532942584s" podCreationTimestamp="2026-01-21 15:52:27 +0000 UTC" firstStartedPulling="2026-01-21 15:52:28.748687865 +0000 UTC m=+1713.673177487" lastFinishedPulling="2026-01-21 15:52:41.58038569 +0000 UTC m=+1726.504875302" observedRunningTime="2026-01-21 15:52:42.528342429 +0000 UTC m=+1727.452832051" watchObservedRunningTime="2026-01-21 15:52:42.532942584 +0000 UTC m=+1727.457432206" Jan 21 15:52:43 crc kubenswrapper[4773]: I0121 15:52:43.566271 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c80bcc16-be97-47d8-afb6-7a1378546882" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5671: connect: connection refused" Jan 21 15:52:46 crc 
kubenswrapper[4773]: I0121 15:52:46.180214 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="13e55b8e-491d-4d97-a0cf-56433eb4a7f1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.245:5671: connect: connection refused" Jan 21 15:52:53 crc kubenswrapper[4773]: I0121 15:52:53.384003 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:52:53 crc kubenswrapper[4773]: E0121 15:52:53.384919 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:52:53 crc kubenswrapper[4773]: I0121 15:52:53.564374 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 15:52:56 crc kubenswrapper[4773]: I0121 15:52:56.179439 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 15:52:56 crc kubenswrapper[4773]: I0121 15:52:56.641182 4773 generic.go:334] "Generic (PLEG): container finished" podID="246d4cbd-1a69-4a02-99ab-f716001d4e67" containerID="2972419e31d7910cd5098fd47d7178a5acd49e38ad8331cbbe45f516a7a23f18" exitCode=0 Jan 21 15:52:56 crc kubenswrapper[4773]: I0121 15:52:56.641307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" event={"ID":"246d4cbd-1a69-4a02-99ab-f716001d4e67","Type":"ContainerDied","Data":"2972419e31d7910cd5098fd47d7178a5acd49e38ad8331cbbe45f516a7a23f18"} Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.313853 4773 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.400467 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam\") pod \"246d4cbd-1a69-4a02-99ab-f716001d4e67\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.400577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle\") pod \"246d4cbd-1a69-4a02-99ab-f716001d4e67\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.400982 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory\") pod \"246d4cbd-1a69-4a02-99ab-f716001d4e67\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.401065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsjm9\" (UniqueName: \"kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9\") pod \"246d4cbd-1a69-4a02-99ab-f716001d4e67\" (UID: \"246d4cbd-1a69-4a02-99ab-f716001d4e67\") " Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.406912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9" (OuterVolumeSpecName: "kube-api-access-fsjm9") pod "246d4cbd-1a69-4a02-99ab-f716001d4e67" (UID: "246d4cbd-1a69-4a02-99ab-f716001d4e67"). InnerVolumeSpecName "kube-api-access-fsjm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.416242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "246d4cbd-1a69-4a02-99ab-f716001d4e67" (UID: "246d4cbd-1a69-4a02-99ab-f716001d4e67"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.440218 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "246d4cbd-1a69-4a02-99ab-f716001d4e67" (UID: "246d4cbd-1a69-4a02-99ab-f716001d4e67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.448019 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory" (OuterVolumeSpecName: "inventory") pod "246d4cbd-1a69-4a02-99ab-f716001d4e67" (UID: "246d4cbd-1a69-4a02-99ab-f716001d4e67"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.507263 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.507300 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsjm9\" (UniqueName: \"kubernetes.io/projected/246d4cbd-1a69-4a02-99ab-f716001d4e67-kube-api-access-fsjm9\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.507311 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.507320 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d4cbd-1a69-4a02-99ab-f716001d4e67-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.674057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" event={"ID":"246d4cbd-1a69-4a02-99ab-f716001d4e67","Type":"ContainerDied","Data":"aa8f09519f373a06d0beaf21b038a5c252018f8c23e01925731711941d83ab46"} Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.674106 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa8f09519f373a06d0beaf21b038a5c252018f8c23e01925731711941d83ab46" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.674186 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.759686 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d"] Jan 21 15:52:58 crc kubenswrapper[4773]: E0121 15:52:58.761098 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246d4cbd-1a69-4a02-99ab-f716001d4e67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.761144 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="246d4cbd-1a69-4a02-99ab-f716001d4e67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.761787 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="246d4cbd-1a69-4a02-99ab-f716001d4e67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.763324 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.766058 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.766196 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.766942 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.767146 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.771425 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d"] Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.817230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr6mm\" (UniqueName: \"kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.817483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.817882 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: E0121 15:52:58.883033 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod246d4cbd_1a69_4a02_99ab_f716001d4e67.slice/crio-aa8f09519f373a06d0beaf21b038a5c252018f8c23e01925731711941d83ab46\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod246d4cbd_1a69_4a02_99ab_f716001d4e67.slice\": RecentStats: unable to find data in memory cache]" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.920122 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.920245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr6mm\" (UniqueName: \"kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.920341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.924801 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.929012 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:58 crc kubenswrapper[4773]: I0121 15:52:58.939532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr6mm\" (UniqueName: \"kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pn58d\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:59 crc kubenswrapper[4773]: I0121 15:52:59.124755 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:52:59 crc kubenswrapper[4773]: I0121 15:52:59.739527 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d"] Jan 21 15:53:00 crc kubenswrapper[4773]: I0121 15:53:00.703008 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" event={"ID":"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17","Type":"ContainerStarted","Data":"63eaf218f11bd4c4b5c538abc4c7618bf855620331663b548bb241e5d8e7425a"} Jan 21 15:53:00 crc kubenswrapper[4773]: I0121 15:53:00.703342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" event={"ID":"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17","Type":"ContainerStarted","Data":"a9b80d4cfefb21ab1a823fa3173216e23f353566f4659ad4841ddb33dff05c90"} Jan 21 15:53:00 crc kubenswrapper[4773]: I0121 15:53:00.731272 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" podStartSLOduration=2.317046004 podStartE2EDuration="2.731226992s" podCreationTimestamp="2026-01-21 15:52:58 +0000 UTC" firstStartedPulling="2026-01-21 15:52:59.751543117 +0000 UTC m=+1744.676032739" lastFinishedPulling="2026-01-21 15:53:00.165724105 +0000 UTC m=+1745.090213727" observedRunningTime="2026-01-21 15:53:00.718585567 +0000 UTC m=+1745.643075199" watchObservedRunningTime="2026-01-21 15:53:00.731226992 +0000 UTC m=+1745.655716614" Jan 21 15:53:03 crc kubenswrapper[4773]: I0121 15:53:03.760646 4773 generic.go:334] "Generic (PLEG): container finished" podID="c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" containerID="63eaf218f11bd4c4b5c538abc4c7618bf855620331663b548bb241e5d8e7425a" exitCode=0 Jan 21 15:53:03 crc kubenswrapper[4773]: I0121 15:53:03.760734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" event={"ID":"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17","Type":"ContainerDied","Data":"63eaf218f11bd4c4b5c538abc4c7618bf855620331663b548bb241e5d8e7425a"} Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.327397 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.470076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam\") pod \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.470231 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr6mm\" (UniqueName: \"kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm\") pod \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.470443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory\") pod \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\" (UID: \"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17\") " Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.504160 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm" (OuterVolumeSpecName: "kube-api-access-tr6mm") pod "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" (UID: "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17"). InnerVolumeSpecName "kube-api-access-tr6mm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.530803 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" (UID: "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.536862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory" (OuterVolumeSpecName: "inventory") pod "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" (UID: "c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.573895 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.573935 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr6mm\" (UniqueName: \"kubernetes.io/projected/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-kube-api-access-tr6mm\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.573944 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.786039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" 
event={"ID":"c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17","Type":"ContainerDied","Data":"a9b80d4cfefb21ab1a823fa3173216e23f353566f4659ad4841ddb33dff05c90"} Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.786127 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b80d4cfefb21ab1a823fa3173216e23f353566f4659ad4841ddb33dff05c90" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.786074 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pn58d" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.883788 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6"] Jan 21 15:53:05 crc kubenswrapper[4773]: E0121 15:53:05.884404 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.884483 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.884770 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.885938 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.890646 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.890894 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.891083 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.891773 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.918486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6"] Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.982980 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.983048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnl24\" (UniqueName: \"kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.983099 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:05 crc kubenswrapper[4773]: I0121 15:53:05.983170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.085662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.085757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnl24\" (UniqueName: \"kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.085799 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.085863 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.091315 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.091324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.092212 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.105548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnl24\" (UniqueName: \"kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.208374 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:53:06 crc kubenswrapper[4773]: I0121 15:53:06.799650 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6"] Jan 21 15:53:07 crc kubenswrapper[4773]: I0121 15:53:07.807655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" event={"ID":"449961cf-fcdf-4c35-9387-c3055f3364cb","Type":"ContainerStarted","Data":"60948a94adb81ba344f751f022007fe85634c289679c47836e9c87eb59a5d5b8"} Jan 21 15:53:07 crc kubenswrapper[4773]: I0121 15:53:07.808658 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" event={"ID":"449961cf-fcdf-4c35-9387-c3055f3364cb","Type":"ContainerStarted","Data":"a1bad02eba1109c1cea57843b7ff5386c078690c931c94b874a26d4c6e2e5708"} Jan 21 15:53:07 crc kubenswrapper[4773]: I0121 15:53:07.833883 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" podStartSLOduration=2.319945417 podStartE2EDuration="2.83386411s" podCreationTimestamp="2026-01-21 15:53:05 +0000 UTC" firstStartedPulling="2026-01-21 15:53:06.804148953 +0000 UTC m=+1751.728638575" 
lastFinishedPulling="2026-01-21 15:53:07.318067656 +0000 UTC m=+1752.242557268" observedRunningTime="2026-01-21 15:53:07.830539661 +0000 UTC m=+1752.755029283" watchObservedRunningTime="2026-01-21 15:53:07.83386411 +0000 UTC m=+1752.758353732" Jan 21 15:53:08 crc kubenswrapper[4773]: I0121 15:53:08.385186 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:53:08 crc kubenswrapper[4773]: E0121 15:53:08.385509 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:53:16 crc kubenswrapper[4773]: I0121 15:53:16.146678 4773 scope.go:117] "RemoveContainer" containerID="776ffd65e5f9d25fb0aabd5932a286dc6c16b7cc45cc3b53741f73dc28f94960" Jan 21 15:53:16 crc kubenswrapper[4773]: I0121 15:53:16.176688 4773 scope.go:117] "RemoveContainer" containerID="3f780e2e7efd070f6ca156eac46e0794a60b1878fd3c276a4f4c925e35aaa3f3" Jan 21 15:53:16 crc kubenswrapper[4773]: I0121 15:53:16.227292 4773 scope.go:117] "RemoveContainer" containerID="13c5a6b56405fe73ba002bbbd2f97415535a72bb99eeb4ff7abe02d50d331d0c" Jan 21 15:53:16 crc kubenswrapper[4773]: I0121 15:53:16.288555 4773 scope.go:117] "RemoveContainer" containerID="c88e906fd599d1127b95fa47a2a6334a9ee19bdcf5e06009f648515a6e4d00a3" Jan 21 15:53:16 crc kubenswrapper[4773]: I0121 15:53:16.314505 4773 scope.go:117] "RemoveContainer" containerID="f69df6d0719fa4a66cc816f3d5eddd9062642cfcc58665312fd7ed662bb9db07" Jan 21 15:53:21 crc kubenswrapper[4773]: I0121 15:53:21.384879 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 
15:53:21 crc kubenswrapper[4773]: E0121 15:53:21.385655 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:53:33 crc kubenswrapper[4773]: I0121 15:53:33.384555 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:53:33 crc kubenswrapper[4773]: E0121 15:53:33.385340 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:53:48 crc kubenswrapper[4773]: I0121 15:53:48.384967 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:53:48 crc kubenswrapper[4773]: E0121 15:53:48.386496 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:54:00 crc kubenswrapper[4773]: I0121 15:54:00.383673 4773 scope.go:117] "RemoveContainer" 
containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:54:00 crc kubenswrapper[4773]: E0121 15:54:00.384523 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:54:15 crc kubenswrapper[4773]: I0121 15:54:15.390907 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:54:15 crc kubenswrapper[4773]: E0121 15:54:15.391716 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:54:16 crc kubenswrapper[4773]: I0121 15:54:16.478876 4773 scope.go:117] "RemoveContainer" containerID="f2c7b6ef123f16d70ab794a89c465b4d07965eab16e83e431aa8d70489a42d31" Jan 21 15:54:29 crc kubenswrapper[4773]: I0121 15:54:29.384597 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:54:29 crc kubenswrapper[4773]: E0121 15:54:29.385457 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:54:43 crc kubenswrapper[4773]: I0121 15:54:43.383792 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:54:43 crc kubenswrapper[4773]: E0121 15:54:43.384546 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:54:52 crc kubenswrapper[4773]: I0121 15:54:52.681312 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" podUID="22988650-1474-4ba4-a6c0-2deb003ae3e7" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 21 15:54:57 crc kubenswrapper[4773]: I0121 15:54:57.384459 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:54:57 crc kubenswrapper[4773]: E0121 15:54:57.385021 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:55:12 crc kubenswrapper[4773]: I0121 15:55:12.383674 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:55:12 crc kubenswrapper[4773]: E0121 
15:55:12.384466 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:55:24 crc kubenswrapper[4773]: I0121 15:55:24.384193 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:55:24 crc kubenswrapper[4773]: E0121 15:55:24.385035 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:55:27 crc kubenswrapper[4773]: I0121 15:55:27.048158 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1880-account-create-update-jh658"] Jan 21 15:55:27 crc kubenswrapper[4773]: I0121 15:55:27.058811 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1880-account-create-update-jh658"] Jan 21 15:55:27 crc kubenswrapper[4773]: I0121 15:55:27.407192 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f06063-64e0-4270-9b6e-258104f23d0a" path="/var/lib/kubelet/pods/36f06063-64e0-4270-9b6e-258104f23d0a/volumes" Jan 21 15:55:28 crc kubenswrapper[4773]: I0121 15:55:28.042491 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bba4-account-create-update-d28hs"] Jan 21 15:55:28 crc kubenswrapper[4773]: I0121 15:55:28.055003 4773 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-bba4-account-create-update-d28hs"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.032406 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hsgk4"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.045548 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-49fq5"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.055566 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dp4b6"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.064730 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-836e-account-create-update-vr956"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.073644 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-836e-account-create-update-vr956"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.082193 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dp4b6"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.090734 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hsgk4"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.100828 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-49fq5"] Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.395924 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2851f3fb-2572-4f19-be86-4771b3b33b06" path="/var/lib/kubelet/pods/2851f3fb-2572-4f19-be86-4771b3b33b06/volumes" Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.399290 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5feba328-ee96-4d3d-8654-c70374332b17" path="/var/lib/kubelet/pods/5feba328-ee96-4d3d-8654-c70374332b17/volumes" Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.400005 4773 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b1c38ae4-a741-4aaf-9f63-9f2384ad30a3" path="/var/lib/kubelet/pods/b1c38ae4-a741-4aaf-9f63-9f2384ad30a3/volumes" Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.400627 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd694032-4f44-44b3-b920-83c2cae0bcb5" path="/var/lib/kubelet/pods/dd694032-4f44-44b3-b920-83c2cae0bcb5/volumes" Jan 21 15:55:29 crc kubenswrapper[4773]: I0121 15:55:29.401740 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07cf5f7-a5cc-49e5-a90f-f49a75c395fe" path="/var/lib/kubelet/pods/f07cf5f7-a5cc-49e5-a90f-f49a75c395fe/volumes" Jan 21 15:55:35 crc kubenswrapper[4773]: I0121 15:55:35.390528 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:55:35 crc kubenswrapper[4773]: E0121 15:55:35.391345 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 15:55:37.031518 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6tg4p"] Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 15:55:37.046517 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6x9bw"] Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 15:55:37.056157 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6tg4p"] Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 15:55:37.065826 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6x9bw"] Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 
15:55:37.395707 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71edab3a-2ae1-4703-a506-e2a278eb5542" path="/var/lib/kubelet/pods/71edab3a-2ae1-4703-a506-e2a278eb5542/volumes" Jan 21 15:55:37 crc kubenswrapper[4773]: I0121 15:55:37.644814 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f238aaa6-9768-4a13-b711-158160bfe40f" path="/var/lib/kubelet/pods/f238aaa6-9768-4a13-b711-158160bfe40f/volumes" Jan 21 15:55:39 crc kubenswrapper[4773]: I0121 15:55:39.028710 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-fkmzk"] Jan 21 15:55:39 crc kubenswrapper[4773]: I0121 15:55:39.040293 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-fkmzk"] Jan 21 15:55:39 crc kubenswrapper[4773]: I0121 15:55:39.396462 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab9197c-22f5-484b-b154-df64f7433d7d" path="/var/lib/kubelet/pods/1ab9197c-22f5-484b-b154-df64f7433d7d/volumes" Jan 21 15:55:42 crc kubenswrapper[4773]: I0121 15:55:42.033436 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hvg5f"] Jan 21 15:55:42 crc kubenswrapper[4773]: I0121 15:55:42.042937 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hvg5f"] Jan 21 15:55:43 crc kubenswrapper[4773]: I0121 15:55:43.413410 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e304a8-01b0-46e8-85b0-d06af7a285c6" path="/var/lib/kubelet/pods/46e304a8-01b0-46e8-85b0-d06af7a285c6/volumes" Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.037355 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a51f-account-create-update-5nnlx"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.049065 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-141e-account-create-update-z9rr9"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 
15:55:46.062102 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a51f-account-create-update-5nnlx"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.070934 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3d4e-account-create-update-zgrzh"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.080649 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-8e6c-account-create-update-8642z"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.091428 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-141e-account-create-update-z9rr9"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.100850 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-8e6c-account-create-update-8642z"] Jan 21 15:55:46 crc kubenswrapper[4773]: I0121 15:55:46.110130 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3d4e-account-create-update-zgrzh"] Jan 21 15:55:47 crc kubenswrapper[4773]: I0121 15:55:47.384256 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:55:47 crc kubenswrapper[4773]: E0121 15:55:47.384521 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 15:55:47 crc kubenswrapper[4773]: I0121 15:55:47.403577 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fc1d3b-2e18-449e-87be-3cab6a8668a1" path="/var/lib/kubelet/pods/30fc1d3b-2e18-449e-87be-3cab6a8668a1/volumes" Jan 21 15:55:47 crc kubenswrapper[4773]: I0121 15:55:47.404810 
4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e701e1-3d35-4188-9fe4-8e25b7e0e99e" path="/var/lib/kubelet/pods/46e701e1-3d35-4188-9fe4-8e25b7e0e99e/volumes" Jan 21 15:55:47 crc kubenswrapper[4773]: I0121 15:55:47.406881 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0bf1bf-486e-40e7-80cb-4eff17210708" path="/var/lib/kubelet/pods/ce0bf1bf-486e-40e7-80cb-4eff17210708/volumes" Jan 21 15:55:47 crc kubenswrapper[4773]: I0121 15:55:47.407877 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14d0772-1452-4862-ad42-7c992e1bc03a" path="/var/lib/kubelet/pods/e14d0772-1452-4862-ad42-7c992e1bc03a/volumes" Jan 21 15:55:59 crc kubenswrapper[4773]: I0121 15:55:59.384030 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2" Jan 21 15:56:00 crc kubenswrapper[4773]: I0121 15:56:00.563319 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2"} Jan 21 15:56:01 crc kubenswrapper[4773]: I0121 15:56:01.047457 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jdjh8"] Jan 21 15:56:01 crc kubenswrapper[4773]: I0121 15:56:01.074461 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jdjh8"] Jan 21 15:56:01 crc kubenswrapper[4773]: I0121 15:56:01.396150 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a667cd6a-52e3-4221-914f-c662638460d4" path="/var/lib/kubelet/pods/a667cd6a-52e3-4221-914f-c662638460d4/volumes" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 15:56:16.578725 4773 scope.go:117] "RemoveContainer" containerID="dbc0f2ed3adc05f3f27e1629547c8e8bb72187bf44a744660463774089332619" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 
15:56:16.608046 4773 scope.go:117] "RemoveContainer" containerID="5240e1f81ee3532c8baa0e6abbfd543d782790cb19a1955ce46321378560da6c" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 15:56:16.652661 4773 scope.go:117] "RemoveContainer" containerID="57cf8caf877e0a309a7bc4f94ace397b71bb4574d97ff0ef185d300cb7c40feb" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 15:56:16.673563 4773 scope.go:117] "RemoveContainer" containerID="aa7619e4501048b3f9fc1c5340210df2fd925689278c2a084decd5c5c22e83df" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 15:56:16.703374 4773 scope.go:117] "RemoveContainer" containerID="b08ddcaa57f63148854f8a8f67ba469dd07c4ae0cfa306ebe0ef01dbf0eee70e" Jan 21 15:56:16 crc kubenswrapper[4773]: I0121 15:56:16.746135 4773 scope.go:117] "RemoveContainer" containerID="cbc2707f7fdb0d7f636dc71af38069483b71902b168c3d3bd004923dd2d0ba8c" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.461000 4773 scope.go:117] "RemoveContainer" containerID="af85c068de34a167ec78b7828382cc548a72139657ed074f4c80f80fdc5e3bfb" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.482621 4773 scope.go:117] "RemoveContainer" containerID="85434c8e08817287b2779eb6b6f7b88855f3e8c8178232d7dc65641ba461e6f4" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.514234 4773 scope.go:117] "RemoveContainer" containerID="1a0940fd4d2bdac5744603b18f5b76869251a3f3de035474d8f8c25a587b52ac" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.585077 4773 scope.go:117] "RemoveContainer" containerID="e07c31df794d0612bc360ba53bdab145e593b90d70d987c9422cc3936bf7e013" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.639856 4773 scope.go:117] "RemoveContainer" containerID="a1f7869f1fa2684a7a9bc185781c6e35baf7b9a82cf4ec4f6d2b6341afff411f" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.674813 4773 scope.go:117] "RemoveContainer" containerID="2c426a0de209f703f1ba7f615e457ed31887ba833d4198825084bde376ad6015" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.730468 4773 
scope.go:117] "RemoveContainer" containerID="8adec2fc1a6166c1ee1684c9fdb6cd207fdbc00362a29c3fe4922b7abd97de24" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.781764 4773 scope.go:117] "RemoveContainer" containerID="e6a5c3f3bbc2a1b7ba8ef36d8cbc9676d13cedcfd2f5f436ecd0770f2dfd6731" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.802422 4773 scope.go:117] "RemoveContainer" containerID="d3bd8172c31f65cea2e49b259dc8168c9851e33b7b7a1c19002f635a9e1f7d4d" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.820431 4773 scope.go:117] "RemoveContainer" containerID="a295ec39e5596e593ac63cfb20df027f6dca2405c2a6623e54ad5399a7982e12" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.840104 4773 scope.go:117] "RemoveContainer" containerID="8d02e47ccecc0ffee1f8917b3077744d2c6dff83cf1b69b4bf6fb3c17250ac31" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.862392 4773 scope.go:117] "RemoveContainer" containerID="93cf2708082e8caf0dfe56c404b5db196a04b54f9c9b54bd63c98df7728122bd" Jan 21 15:56:18 crc kubenswrapper[4773]: I0121 15:56:18.881038 4773 scope.go:117] "RemoveContainer" containerID="2b6aff19116882f7e97b35a58aae8ec18f287a2317918a696aa0c51a819cca1b" Jan 21 15:56:34 crc kubenswrapper[4773]: I0121 15:56:34.917297 4773 generic.go:334] "Generic (PLEG): container finished" podID="449961cf-fcdf-4c35-9387-c3055f3364cb" containerID="60948a94adb81ba344f751f022007fe85634c289679c47836e9c87eb59a5d5b8" exitCode=0 Jan 21 15:56:34 crc kubenswrapper[4773]: I0121 15:56:34.917408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" event={"ID":"449961cf-fcdf-4c35-9387-c3055f3364cb","Type":"ContainerDied","Data":"60948a94adb81ba344f751f022007fe85634c289679c47836e9c87eb59a5d5b8"} Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.422826 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.521058 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam\") pod \"449961cf-fcdf-4c35-9387-c3055f3364cb\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.521172 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnl24\" (UniqueName: \"kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24\") pod \"449961cf-fcdf-4c35-9387-c3055f3364cb\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.521227 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory\") pod \"449961cf-fcdf-4c35-9387-c3055f3364cb\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.521364 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle\") pod \"449961cf-fcdf-4c35-9387-c3055f3364cb\" (UID: \"449961cf-fcdf-4c35-9387-c3055f3364cb\") " Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.534952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24" (OuterVolumeSpecName: "kube-api-access-nnl24") pod "449961cf-fcdf-4c35-9387-c3055f3364cb" (UID: "449961cf-fcdf-4c35-9387-c3055f3364cb"). InnerVolumeSpecName "kube-api-access-nnl24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.537171 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "449961cf-fcdf-4c35-9387-c3055f3364cb" (UID: "449961cf-fcdf-4c35-9387-c3055f3364cb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.565076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "449961cf-fcdf-4c35-9387-c3055f3364cb" (UID: "449961cf-fcdf-4c35-9387-c3055f3364cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.567965 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory" (OuterVolumeSpecName: "inventory") pod "449961cf-fcdf-4c35-9387-c3055f3364cb" (UID: "449961cf-fcdf-4c35-9387-c3055f3364cb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.623757 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.623799 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnl24\" (UniqueName: \"kubernetes.io/projected/449961cf-fcdf-4c35-9387-c3055f3364cb-kube-api-access-nnl24\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.623809 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.623820 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449961cf-fcdf-4c35-9387-c3055f3364cb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.936474 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" event={"ID":"449961cf-fcdf-4c35-9387-c3055f3364cb","Type":"ContainerDied","Data":"a1bad02eba1109c1cea57843b7ff5386c078690c931c94b874a26d4c6e2e5708"} Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.936518 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6" Jan 21 15:56:36 crc kubenswrapper[4773]: I0121 15:56:36.936526 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1bad02eba1109c1cea57843b7ff5386c078690c931c94b874a26d4c6e2e5708" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.028968 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg"] Jan 21 15:56:37 crc kubenswrapper[4773]: E0121 15:56:37.029468 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449961cf-fcdf-4c35-9387-c3055f3364cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.029492 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="449961cf-fcdf-4c35-9387-c3055f3364cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.029732 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="449961cf-fcdf-4c35-9387-c3055f3364cb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.030408 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.038519 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.039385 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.039789 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.039924 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.065328 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg"] Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.136210 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzx4\" (UniqueName: \"kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.136811 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc 
kubenswrapper[4773]: I0121 15:56:37.137380 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.239252 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.239351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzx4\" (UniqueName: \"kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.239380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.255034 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.255043 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.258949 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzx4\" (UniqueName: \"kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.354145 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.911217 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.912375 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg"] Jan 21 15:56:37 crc kubenswrapper[4773]: I0121 15:56:37.948856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" event={"ID":"3726d208-3122-4e0e-a802-7f9b0c59621c","Type":"ContainerStarted","Data":"79c0f23f707329d889713da1650bffccf72fa93f5fb1397218429d7067d1d754"} Jan 21 15:56:39 crc kubenswrapper[4773]: I0121 15:56:39.972078 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" event={"ID":"3726d208-3122-4e0e-a802-7f9b0c59621c","Type":"ContainerStarted","Data":"c7dda10d840bd3000404c477740a654e3b4795f0a05514233ec94652e9d6ed39"} Jan 21 15:56:39 crc kubenswrapper[4773]: I0121 15:56:39.993289 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" podStartSLOduration=1.9667304859999999 podStartE2EDuration="2.993269968s" podCreationTimestamp="2026-01-21 15:56:37 +0000 UTC" firstStartedPulling="2026-01-21 15:56:37.910831274 +0000 UTC m=+1962.835320896" lastFinishedPulling="2026-01-21 15:56:38.937370766 +0000 UTC m=+1963.861860378" observedRunningTime="2026-01-21 15:56:39.984723468 +0000 UTC m=+1964.909213100" watchObservedRunningTime="2026-01-21 15:56:39.993269968 +0000 UTC m=+1964.917759590" Jan 21 15:56:49 crc kubenswrapper[4773]: I0121 15:56:49.065014 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7vcf9"] Jan 21 15:56:49 crc kubenswrapper[4773]: 
I0121 15:56:49.076813 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7vcf9"] Jan 21 15:56:49 crc kubenswrapper[4773]: I0121 15:56:49.395529 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee2313d-678e-487c-a4af-ae303d40bedd" path="/var/lib/kubelet/pods/3ee2313d-678e-487c-a4af-ae303d40bedd/volumes" Jan 21 15:57:07 crc kubenswrapper[4773]: I0121 15:57:07.032483 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-l79d2"] Jan 21 15:57:07 crc kubenswrapper[4773]: I0121 15:57:07.042927 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-l79d2"] Jan 21 15:57:07 crc kubenswrapper[4773]: I0121 15:57:07.396785 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a44a95-1489-4fc8-8cbc-8f82568dcdfd" path="/var/lib/kubelet/pods/f3a44a95-1489-4fc8-8cbc-8f82568dcdfd/volumes" Jan 21 15:57:08 crc kubenswrapper[4773]: I0121 15:57:08.047205 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6x5cx"] Jan 21 15:57:08 crc kubenswrapper[4773]: I0121 15:57:08.059746 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6bhfp"] Jan 21 15:57:08 crc kubenswrapper[4773]: I0121 15:57:08.072885 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6bhfp"] Jan 21 15:57:08 crc kubenswrapper[4773]: I0121 15:57:08.085136 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6x5cx"] Jan 21 15:57:09 crc kubenswrapper[4773]: I0121 15:57:09.396822 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eae6f1f-bc67-4acc-836b-68396e478669" path="/var/lib/kubelet/pods/3eae6f1f-bc67-4acc-836b-68396e478669/volumes" Jan 21 15:57:09 crc kubenswrapper[4773]: I0121 15:57:09.397818 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12aa4f9-2fe2-4bfd-b764-3755131eb10a" 
path="/var/lib/kubelet/pods/b12aa4f9-2fe2-4bfd-b764-3755131eb10a/volumes" Jan 21 15:57:13 crc kubenswrapper[4773]: I0121 15:57:13.029917 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-n2bxp"] Jan 21 15:57:13 crc kubenswrapper[4773]: I0121 15:57:13.040649 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-n2bxp"] Jan 21 15:57:13 crc kubenswrapper[4773]: I0121 15:57:13.396602 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd3f2b8-c365-4845-8508-2403d3b1f03e" path="/var/lib/kubelet/pods/3cd3f2b8-c365-4845-8508-2403d3b1f03e/volumes" Jan 21 15:57:19 crc kubenswrapper[4773]: I0121 15:57:19.148003 4773 scope.go:117] "RemoveContainer" containerID="7d86cea0536d0c13baaabd5485b8eaf52a9043d6004981b6d7a42e2aeaa3129d" Jan 21 15:57:19 crc kubenswrapper[4773]: I0121 15:57:19.200461 4773 scope.go:117] "RemoveContainer" containerID="c67b185dffa20fe1a4a76944ac60a9359ff42fdac69e7b491c802b48b2b8a22d" Jan 21 15:57:19 crc kubenswrapper[4773]: I0121 15:57:19.246493 4773 scope.go:117] "RemoveContainer" containerID="dd707d39e543364bfdf48d96c129e34d8eda0f48a81578d43c56fa9b9127e4e7" Jan 21 15:57:19 crc kubenswrapper[4773]: I0121 15:57:19.303257 4773 scope.go:117] "RemoveContainer" containerID="5d733e8c68f4ba219ff27620276d938a782447f1b4e38ce0f63ddc39f8b58720" Jan 21 15:57:19 crc kubenswrapper[4773]: I0121 15:57:19.378436 4773 scope.go:117] "RemoveContainer" containerID="3714d8c788885f1482f8ecf2d8a36732d8580ef97d5ff098dbfddd4e34f09975" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.734903 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"] Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.738001 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.749661 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"] Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.790603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.790815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msscg\" (UniqueName: \"kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.790927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.906538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.906015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.907992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msscg\" (UniqueName: \"kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.908107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.908506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:21 crc kubenswrapper[4773]: I0121 15:57:21.939664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msscg\" (UniqueName: \"kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg\") pod \"redhat-operators-2qr74\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") " pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:22 crc kubenswrapper[4773]: I0121 15:57:22.206132 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:22 crc kubenswrapper[4773]: I0121 15:57:22.723828 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"] Jan 21 15:57:23 crc kubenswrapper[4773]: I0121 15:57:23.451446 4773 generic.go:334] "Generic (PLEG): container finished" podID="79050442-5738-4186-bed8-aab99e80c76b" containerID="f8a4c8760dd8fbef245b68af584c506e457b4b4ab0af66cac1590e467742e7d5" exitCode=0 Jan 21 15:57:23 crc kubenswrapper[4773]: I0121 15:57:23.451503 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerDied","Data":"f8a4c8760dd8fbef245b68af584c506e457b4b4ab0af66cac1590e467742e7d5"} Jan 21 15:57:23 crc kubenswrapper[4773]: I0121 15:57:23.452820 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerStarted","Data":"7193ca502eda90a30c358fef0337915938ab20d616de4718ec36b0334db50823"} Jan 21 15:57:26 crc kubenswrapper[4773]: I0121 15:57:26.482275 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerStarted","Data":"04894cb46d0980c6bb281a8c0011a911b5b110638c91eccbde15c89cd486163f"} Jan 21 15:57:32 crc kubenswrapper[4773]: I0121 15:57:32.039874 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pdgr9"] Jan 21 15:57:32 crc kubenswrapper[4773]: I0121 15:57:32.048499 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pdgr9"] Jan 21 15:57:33 crc kubenswrapper[4773]: I0121 15:57:33.397302 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd62e746-7c8e-4a74-a37e-3daa482a53ba" 
path="/var/lib/kubelet/pods/cd62e746-7c8e-4a74-a37e-3daa482a53ba/volumes" Jan 21 15:57:33 crc kubenswrapper[4773]: I0121 15:57:33.599427 4773 generic.go:334] "Generic (PLEG): container finished" podID="79050442-5738-4186-bed8-aab99e80c76b" containerID="04894cb46d0980c6bb281a8c0011a911b5b110638c91eccbde15c89cd486163f" exitCode=0 Jan 21 15:57:33 crc kubenswrapper[4773]: I0121 15:57:33.599492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerDied","Data":"04894cb46d0980c6bb281a8c0011a911b5b110638c91eccbde15c89cd486163f"} Jan 21 15:57:35 crc kubenswrapper[4773]: I0121 15:57:35.622305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerStarted","Data":"b4062dca5802201adf99eb8c9bfc83ea65c21c5fc9b07491d2bae3f8f5cca273"} Jan 21 15:57:35 crc kubenswrapper[4773]: I0121 15:57:35.648644 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qr74" podStartSLOduration=3.431060217 podStartE2EDuration="14.648625217s" podCreationTimestamp="2026-01-21 15:57:21 +0000 UTC" firstStartedPulling="2026-01-21 15:57:23.453852307 +0000 UTC m=+2008.378341929" lastFinishedPulling="2026-01-21 15:57:34.671417307 +0000 UTC m=+2019.595906929" observedRunningTime="2026-01-21 15:57:35.640947696 +0000 UTC m=+2020.565437318" watchObservedRunningTime="2026-01-21 15:57:35.648625217 +0000 UTC m=+2020.573114839" Jan 21 15:57:42 crc kubenswrapper[4773]: I0121 15:57:42.207219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:42 crc kubenswrapper[4773]: I0121 15:57:42.207834 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:43 crc 
kubenswrapper[4773]: I0121 15:57:43.259173 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2qr74" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="registry-server" probeResult="failure" output=< Jan 21 15:57:43 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 15:57:43 crc kubenswrapper[4773]: > Jan 21 15:57:45 crc kubenswrapper[4773]: I0121 15:57:45.038149 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-znjg2"] Jan 21 15:57:45 crc kubenswrapper[4773]: I0121 15:57:45.078223 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-znjg2"] Jan 21 15:57:45 crc kubenswrapper[4773]: I0121 15:57:45.398305 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0beb28-481c-4507-94c2-d644e4faf5ab" path="/var/lib/kubelet/pods/5c0beb28-481c-4507-94c2-d644e4faf5ab/volumes" Jan 21 15:57:52 crc kubenswrapper[4773]: I0121 15:57:52.259483 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:52 crc kubenswrapper[4773]: I0121 15:57:52.316776 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qr74" Jan 21 15:57:52 crc kubenswrapper[4773]: I0121 15:57:52.946934 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"] Jan 21 15:57:53 crc kubenswrapper[4773]: I0121 15:57:53.791803 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qr74" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="registry-server" containerID="cri-o://b4062dca5802201adf99eb8c9bfc83ea65c21c5fc9b07491d2bae3f8f5cca273" gracePeriod=2 Jan 21 15:57:54 crc kubenswrapper[4773]: I0121 15:57:54.874050 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="79050442-5738-4186-bed8-aab99e80c76b" containerID="b4062dca5802201adf99eb8c9bfc83ea65c21c5fc9b07491d2bae3f8f5cca273" exitCode=0
Jan 21 15:57:54 crc kubenswrapper[4773]: I0121 15:57:54.874417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerDied","Data":"b4062dca5802201adf99eb8c9bfc83ea65c21c5fc9b07491d2bae3f8f5cca273"}
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.137263 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qr74"
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.189136 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities\") pod \"79050442-5738-4186-bed8-aab99e80c76b\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") "
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.189342 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msscg\" (UniqueName: \"kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg\") pod \"79050442-5738-4186-bed8-aab99e80c76b\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") "
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.189380 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content\") pod \"79050442-5738-4186-bed8-aab99e80c76b\" (UID: \"79050442-5738-4186-bed8-aab99e80c76b\") "
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.191439 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities" (OuterVolumeSpecName: "utilities") pod "79050442-5738-4186-bed8-aab99e80c76b" (UID: "79050442-5738-4186-bed8-aab99e80c76b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.209478 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg" (OuterVolumeSpecName: "kube-api-access-msscg") pod "79050442-5738-4186-bed8-aab99e80c76b" (UID: "79050442-5738-4186-bed8-aab99e80c76b"). InnerVolumeSpecName "kube-api-access-msscg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.292773 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msscg\" (UniqueName: \"kubernetes.io/projected/79050442-5738-4186-bed8-aab99e80c76b-kube-api-access-msscg\") on node \"crc\" DevicePath \"\""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.292862 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.302969 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79050442-5738-4186-bed8-aab99e80c76b" (UID: "79050442-5738-4186-bed8-aab99e80c76b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.408085 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79050442-5738-4186-bed8-aab99e80c76b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.890544 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qr74" event={"ID":"79050442-5738-4186-bed8-aab99e80c76b","Type":"ContainerDied","Data":"7193ca502eda90a30c358fef0337915938ab20d616de4718ec36b0334db50823"}
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.890748 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qr74"
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.891106 4773 scope.go:117] "RemoveContainer" containerID="b4062dca5802201adf99eb8c9bfc83ea65c21c5fc9b07491d2bae3f8f5cca273"
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.923078 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"]
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.925132 4773 scope.go:117] "RemoveContainer" containerID="04894cb46d0980c6bb281a8c0011a911b5b110638c91eccbde15c89cd486163f"
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.932508 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qr74"]
Jan 21 15:57:55 crc kubenswrapper[4773]: I0121 15:57:55.951135 4773 scope.go:117] "RemoveContainer" containerID="f8a4c8760dd8fbef245b68af584c506e457b4b4ab0af66cac1590e467742e7d5"
Jan 21 15:57:57 crc kubenswrapper[4773]: I0121 15:57:57.395635 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79050442-5738-4186-bed8-aab99e80c76b" path="/var/lib/kubelet/pods/79050442-5738-4186-bed8-aab99e80c76b/volumes"
Jan 21 15:58:19 crc kubenswrapper[4773]: I0121 15:58:19.513002 4773 scope.go:117] "RemoveContainer" containerID="06a517f5eb8b0202913474784c903634052c39116f05e4f7a356add716a82ac0"
Jan 21 15:58:20 crc kubenswrapper[4773]: I0121 15:58:20.906672 4773 scope.go:117] "RemoveContainer" containerID="4db387886c94d9295b7d0446bd4a1606255259c595aabc38c5d6e12d844dc3b8"
Jan 21 15:58:25 crc kubenswrapper[4773]: I0121 15:58:25.206162 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:58:25 crc kubenswrapper[4773]: I0121 15:58:25.207150 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:58:34 crc kubenswrapper[4773]: I0121 15:58:34.234908 4773 generic.go:334] "Generic (PLEG): container finished" podID="3726d208-3122-4e0e-a802-7f9b0c59621c" containerID="c7dda10d840bd3000404c477740a654e3b4795f0a05514233ec94652e9d6ed39" exitCode=0
Jan 21 15:58:34 crc kubenswrapper[4773]: I0121 15:58:34.234994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" event={"ID":"3726d208-3122-4e0e-a802-7f9b0c59621c","Type":"ContainerDied","Data":"c7dda10d840bd3000404c477740a654e3b4795f0a05514233ec94652e9d6ed39"}
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.773710 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg"
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.876426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory\") pod \"3726d208-3122-4e0e-a802-7f9b0c59621c\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") "
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.876570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqzx4\" (UniqueName: \"kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4\") pod \"3726d208-3122-4e0e-a802-7f9b0c59621c\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") "
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.876947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam\") pod \"3726d208-3122-4e0e-a802-7f9b0c59621c\" (UID: \"3726d208-3122-4e0e-a802-7f9b0c59621c\") "
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.882379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4" (OuterVolumeSpecName: "kube-api-access-kqzx4") pod "3726d208-3122-4e0e-a802-7f9b0c59621c" (UID: "3726d208-3122-4e0e-a802-7f9b0c59621c"). InnerVolumeSpecName "kube-api-access-kqzx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.909603 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory" (OuterVolumeSpecName: "inventory") pod "3726d208-3122-4e0e-a802-7f9b0c59621c" (UID: "3726d208-3122-4e0e-a802-7f9b0c59621c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.909800 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3726d208-3122-4e0e-a802-7f9b0c59621c" (UID: "3726d208-3122-4e0e-a802-7f9b0c59621c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.979811 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.979851 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3726d208-3122-4e0e-a802-7f9b0c59621c-inventory\") on node \"crc\" DevicePath \"\""
Jan 21 15:58:35 crc kubenswrapper[4773]: I0121 15:58:35.979861 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqzx4\" (UniqueName: \"kubernetes.io/projected/3726d208-3122-4e0e-a802-7f9b0c59621c-kube-api-access-kqzx4\") on node \"crc\" DevicePath \"\""
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.254278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg" event={"ID":"3726d208-3122-4e0e-a802-7f9b0c59621c","Type":"ContainerDied","Data":"79c0f23f707329d889713da1650bffccf72fa93f5fb1397218429d7067d1d754"}
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.254361 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c0f23f707329d889713da1650bffccf72fa93f5fb1397218429d7067d1d754"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.254402 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.339770 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"]
Jan 21 15:58:36 crc kubenswrapper[4773]: E0121 15:58:36.340439 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="extract-utilities"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340463 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="extract-utilities"
Jan 21 15:58:36 crc kubenswrapper[4773]: E0121 15:58:36.340480 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="extract-content"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340488 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="extract-content"
Jan 21 15:58:36 crc kubenswrapper[4773]: E0121 15:58:36.340509 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="registry-server"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340518 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="registry-server"
Jan 21 15:58:36 crc kubenswrapper[4773]: E0121 15:58:36.340535 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3726d208-3122-4e0e-a802-7f9b0c59621c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340544 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3726d208-3122-4e0e-a802-7f9b0c59621c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340780 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3726d208-3122-4e0e-a802-7f9b0c59621c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.340806 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="79050442-5738-4186-bed8-aab99e80c76b" containerName="registry-server"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.341782 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.344872 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.345072 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.345149 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.350009 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.399657 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94k4\" (UniqueName: \"kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.400127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.400375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.412089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"]
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.501890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q94k4\" (UniqueName: \"kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.502080 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.502232 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.508436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.508733 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.526864 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94k4\" (UniqueName: \"kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:36 crc kubenswrapper[4773]: I0121 15:58:36.673812 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:58:37 crc kubenswrapper[4773]: I0121 15:58:37.347991 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"]
Jan 21 15:58:38 crc kubenswrapper[4773]: I0121 15:58:38.276269 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh" event={"ID":"e1bb300f-6606-458a-aacc-3432b7ad314d","Type":"ContainerStarted","Data":"7f240e50344018f90504634173cef8d1292645ea3ce999bef2e1c669e2e904c5"}
Jan 21 15:58:38 crc kubenswrapper[4773]: I0121 15:58:38.276821 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh" event={"ID":"e1bb300f-6606-458a-aacc-3432b7ad314d","Type":"ContainerStarted","Data":"5b9f580847eb3fa60267f199ee2d283b8afe1dbc034a25d9252833969285cdec"}
Jan 21 15:58:38 crc kubenswrapper[4773]: I0121 15:58:38.298277 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh" podStartSLOduration=1.8861054560000001 podStartE2EDuration="2.298252406s" podCreationTimestamp="2026-01-21 15:58:36 +0000 UTC" firstStartedPulling="2026-01-21 15:58:37.353168825 +0000 UTC m=+2082.277658447" lastFinishedPulling="2026-01-21 15:58:37.765315775 +0000 UTC m=+2082.689805397" observedRunningTime="2026-01-21 15:58:38.294731191 +0000 UTC m=+2083.219220833" watchObservedRunningTime="2026-01-21 15:58:38.298252406 +0000 UTC m=+2083.222742028"
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.064347 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f175-account-create-update-rdlpn"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.075443 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p97m2"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.093965 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k9sdj"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.104349 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ss2cc"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.114020 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k9sdj"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.126620 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f175-account-create-update-rdlpn"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.159607 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p97m2"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.168616 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ss2cc"]
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.396861 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057373ae-c81b-4e2b-a71c-aa81a62c4465" path="/var/lib/kubelet/pods/057373ae-c81b-4e2b-a71c-aa81a62c4465/volumes"
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.397450 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f53fe25-e64e-4570-989f-ae65e7e23a71" path="/var/lib/kubelet/pods/3f53fe25-e64e-4570-989f-ae65e7e23a71/volumes"
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.397973 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e686b20-33ac-474d-a46b-ea308b32cbf3" path="/var/lib/kubelet/pods/4e686b20-33ac-474d-a46b-ea308b32cbf3/volumes"
Jan 21 15:58:39 crc kubenswrapper[4773]: I0121 15:58:39.398685 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6abc2d7-7db5-4e83-9aa3-e8beb591cb41" path="/var/lib/kubelet/pods/e6abc2d7-7db5-4e83-9aa3-e8beb591cb41/volumes"
Jan 21 15:58:40 crc kubenswrapper[4773]: I0121 15:58:40.026314 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d93f-account-create-update-6wdd5"]
Jan 21 15:58:40 crc kubenswrapper[4773]: I0121 15:58:40.035916 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d93f-account-create-update-6wdd5"]
Jan 21 15:58:41 crc kubenswrapper[4773]: I0121 15:58:41.395916 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dede4ee8-95b2-4feb-bedb-4fb9615014f4" path="/var/lib/kubelet/pods/dede4ee8-95b2-4feb-bedb-4fb9615014f4/volumes"
Jan 21 15:58:42 crc kubenswrapper[4773]: I0121 15:58:42.027477 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-40ec-account-create-update-8t8mw"]
Jan 21 15:58:42 crc kubenswrapper[4773]: I0121 15:58:42.037981 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-40ec-account-create-update-8t8mw"]
Jan 21 15:58:43 crc kubenswrapper[4773]: I0121 15:58:43.396914 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfe59ba-7b63-4e75-825b-05b03663557b" path="/var/lib/kubelet/pods/5bfe59ba-7b63-4e75-825b-05b03663557b/volumes"
Jan 21 15:58:55 crc kubenswrapper[4773]: I0121 15:58:55.205668 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:58:55 crc kubenswrapper[4773]: I0121 15:58:55.206256 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.023685 4773 scope.go:117] "RemoveContainer" containerID="6bbcaafb0d2a1ffe8ce7624a1a952fa65c970ab2d42da71b43fe6beffa4955e7"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.049585 4773 scope.go:117] "RemoveContainer" containerID="6be08ef4f36b8ad74ccf236c93896982efedc5d5f17b0cdaef5550cf94a567fa"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.102005 4773 scope.go:117] "RemoveContainer" containerID="abf92234ffc1d745ba6bda73402cfe734e11c2572ac7043a6cf0fc19945b7df5"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.190790 4773 scope.go:117] "RemoveContainer" containerID="d8f84c5d8bfb76a047741159ac1ae3f7705824fb8d50f6fa096960f26cc116ce"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.243063 4773 scope.go:117] "RemoveContainer" containerID="535d1a60cdad8c51a446fe9407095054f6e60a7c07816c48521ee6f3b97260b2"
Jan 21 15:59:21 crc kubenswrapper[4773]: I0121 15:59:21.308359 4773 scope.go:117] "RemoveContainer" containerID="c59894bbd3d82a699d43d1d37f0de0562c8b8a982700a5f942370c63309692ed"
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.205342 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.205888 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.205929 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.206771 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.206829 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2" gracePeriod=600
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.847274 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2" exitCode=0
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.847638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2"}
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.847674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8"}
Jan 21 15:59:25 crc kubenswrapper[4773]: I0121 15:59:25.847716 4773 scope.go:117] "RemoveContainer" containerID="f57f09d8999795ea6ef26c19dc6c8ea32d826453e5a4e1cfbcba37a133288bf2"
Jan 21 15:59:39 crc kubenswrapper[4773]: I0121 15:59:39.074124 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6n97z"]
Jan 21 15:59:39 crc kubenswrapper[4773]: I0121 15:59:39.123807 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6n97z"]
Jan 21 15:59:39 crc kubenswrapper[4773]: I0121 15:59:39.398145 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95f2a61-8fc4-4257-ad91-d0b45169dc09" path="/var/lib/kubelet/pods/b95f2a61-8fc4-4257-ad91-d0b45169dc09/volumes"
Jan 21 15:59:54 crc kubenswrapper[4773]: I0121 15:59:54.104955 4773 generic.go:334] "Generic (PLEG): container finished" podID="e1bb300f-6606-458a-aacc-3432b7ad314d" containerID="7f240e50344018f90504634173cef8d1292645ea3ce999bef2e1c669e2e904c5" exitCode=0
Jan 21 15:59:54 crc kubenswrapper[4773]: I0121 15:59:54.105095 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh" event={"ID":"e1bb300f-6606-458a-aacc-3432b7ad314d","Type":"ContainerDied","Data":"7f240e50344018f90504634173cef8d1292645ea3ce999bef2e1c669e2e904c5"}
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.696946 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.773177 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory\") pod \"e1bb300f-6606-458a-aacc-3432b7ad314d\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") "
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.773247 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam\") pod \"e1bb300f-6606-458a-aacc-3432b7ad314d\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") "
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.773498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q94k4\" (UniqueName: \"kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4\") pod \"e1bb300f-6606-458a-aacc-3432b7ad314d\" (UID: \"e1bb300f-6606-458a-aacc-3432b7ad314d\") "
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.779402 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4" (OuterVolumeSpecName: "kube-api-access-q94k4") pod "e1bb300f-6606-458a-aacc-3432b7ad314d" (UID: "e1bb300f-6606-458a-aacc-3432b7ad314d"). InnerVolumeSpecName "kube-api-access-q94k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.808550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory" (OuterVolumeSpecName: "inventory") pod "e1bb300f-6606-458a-aacc-3432b7ad314d" (UID: "e1bb300f-6606-458a-aacc-3432b7ad314d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.810766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e1bb300f-6606-458a-aacc-3432b7ad314d" (UID: "e1bb300f-6606-458a-aacc-3432b7ad314d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.877107 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q94k4\" (UniqueName: \"kubernetes.io/projected/e1bb300f-6606-458a-aacc-3432b7ad314d-kube-api-access-q94k4\") on node \"crc\" DevicePath \"\""
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.877169 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-inventory\") on node \"crc\" DevicePath \"\""
Jan 21 15:59:55 crc kubenswrapper[4773]: I0121 15:59:55.877194 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1bb300f-6606-458a-aacc-3432b7ad314d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.128859 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh" event={"ID":"e1bb300f-6606-458a-aacc-3432b7ad314d","Type":"ContainerDied","Data":"5b9f580847eb3fa60267f199ee2d283b8afe1dbc034a25d9252833969285cdec"}
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.129229 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b9f580847eb3fa60267f199ee2d283b8afe1dbc034a25d9252833969285cdec"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.128917 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.212840 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"]
Jan 21 15:59:56 crc kubenswrapper[4773]: E0121 15:59:56.213361 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1bb300f-6606-458a-aacc-3432b7ad314d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.213385 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1bb300f-6606-458a-aacc-3432b7ad314d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.213628 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1bb300f-6606-458a-aacc-3432b7ad314d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.214545 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.218531 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.218829 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.218989 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.219139 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.224059 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"]
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.285998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhw7\" (UniqueName: \"kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.286374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.286470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.388711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhw7\" (UniqueName: \"kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.388867 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.388915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.394531 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName:
\"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.405386 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.411238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhw7\" (UniqueName: \"kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4b626\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" Jan 21 15:59:56 crc kubenswrapper[4773]: I0121 15:59:56.532625 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" Jan 21 15:59:57 crc kubenswrapper[4773]: I0121 15:59:57.153967 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"] Jan 21 15:59:58 crc kubenswrapper[4773]: I0121 15:59:58.148782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" event={"ID":"58070518-459d-4437-8aa7-7a532264b18d","Type":"ContainerStarted","Data":"a2887ec44407764bd1ea80b2186d9f3a49dbe7f2a2e8ca0bdd21756896b7b160"} Jan 21 15:59:58 crc kubenswrapper[4773]: I0121 15:59:58.149308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" event={"ID":"58070518-459d-4437-8aa7-7a532264b18d","Type":"ContainerStarted","Data":"a8837e5369f1f06637c6f4c6cfd24041d531f1087014d9a69d0606d092f8ac2a"} Jan 21 15:59:59 crc kubenswrapper[4773]: I0121 15:59:59.174458 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" podStartSLOduration=2.583054685 podStartE2EDuration="3.174432432s" podCreationTimestamp="2026-01-21 15:59:56 +0000 UTC" firstStartedPulling="2026-01-21 15:59:57.156780356 +0000 UTC m=+2162.081269978" lastFinishedPulling="2026-01-21 15:59:57.748158103 +0000 UTC m=+2162.672647725" observedRunningTime="2026-01-21 15:59:59.17400725 +0000 UTC m=+2164.098496872" watchObservedRunningTime="2026-01-21 15:59:59.174432432 +0000 UTC m=+2164.098922054" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.201567 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9"] Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.233126 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.236000 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.240123 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.272247 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9"] Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.323636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.323912 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.324076 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpjv\" (UniqueName: \"kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.425660 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bpjv\" (UniqueName: \"kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.426157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.426376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.428964 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.435371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.444024 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpjv\" (UniqueName: \"kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv\") pod \"collect-profiles-29483520-rxgw9\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:00 crc kubenswrapper[4773]: I0121 16:00:00.579959 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:01 crc kubenswrapper[4773]: I0121 16:00:01.146774 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9"] Jan 21 16:00:01 crc kubenswrapper[4773]: I0121 16:00:01.243075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" event={"ID":"289fa5e2-355b-4012-853c-aad63a1cc1fe","Type":"ContainerStarted","Data":"5a444c0baf333fd1de42a889543216d2e8dad0227a88e716e668ae8b90de7c9e"} Jan 21 16:00:02 crc kubenswrapper[4773]: I0121 16:00:02.255913 4773 generic.go:334] "Generic (PLEG): container finished" podID="289fa5e2-355b-4012-853c-aad63a1cc1fe" containerID="86bf2dfabc1458b3bb6ece8f85223de64a2c69bac7c905cc191e6d8383fac4a5" exitCode=0 Jan 21 16:00:02 crc kubenswrapper[4773]: I0121 16:00:02.256008 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" 
event={"ID":"289fa5e2-355b-4012-853c-aad63a1cc1fe","Type":"ContainerDied","Data":"86bf2dfabc1458b3bb6ece8f85223de64a2c69bac7c905cc191e6d8383fac4a5"} Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.042664 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xhr58"] Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.054740 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xhr58"] Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.409143 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da351801-59dd-48f7-a0cd-4b5466057278" path="/var/lib/kubelet/pods/da351801-59dd-48f7-a0cd-4b5466057278/volumes" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.823619 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.895810 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bwts"] Jan 21 16:00:03 crc kubenswrapper[4773]: E0121 16:00:03.896282 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289fa5e2-355b-4012-853c-aad63a1cc1fe" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.896304 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="289fa5e2-355b-4012-853c-aad63a1cc1fe" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.896516 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="289fa5e2-355b-4012-853c-aad63a1cc1fe" containerName="collect-profiles" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.898462 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:03 crc kubenswrapper[4773]: I0121 16:00:03.920279 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bwts"] Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.022400 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume\") pod \"289fa5e2-355b-4012-853c-aad63a1cc1fe\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.022541 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume\") pod \"289fa5e2-355b-4012-853c-aad63a1cc1fe\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.023590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "289fa5e2-355b-4012-853c-aad63a1cc1fe" (UID: "289fa5e2-355b-4012-853c-aad63a1cc1fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.034197 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "289fa5e2-355b-4012-853c-aad63a1cc1fe" (UID: "289fa5e2-355b-4012-853c-aad63a1cc1fe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.046772 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bpjv\" (UniqueName: \"kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv\") pod \"289fa5e2-355b-4012-853c-aad63a1cc1fe\" (UID: \"289fa5e2-355b-4012-853c-aad63a1cc1fe\") " Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.048496 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.048653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vd8\" (UniqueName: \"kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.048784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.049017 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/289fa5e2-355b-4012-853c-aad63a1cc1fe-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.049037 4773 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/289fa5e2-355b-4012-853c-aad63a1cc1fe-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.091457 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv" (OuterVolumeSpecName: "kube-api-access-2bpjv") pod "289fa5e2-355b-4012-853c-aad63a1cc1fe" (UID: "289fa5e2-355b-4012-853c-aad63a1cc1fe"). InnerVolumeSpecName "kube-api-access-2bpjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.151517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.152051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.152244 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.152428 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.152569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76vd8\" (UniqueName: \"kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.153041 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bpjv\" (UniqueName: \"kubernetes.io/projected/289fa5e2-355b-4012-853c-aad63a1cc1fe-kube-api-access-2bpjv\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.186887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vd8\" (UniqueName: \"kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8\") pod \"certified-operators-7bwts\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") " pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.226051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bwts" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.287971 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.288068 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9" event={"ID":"289fa5e2-355b-4012-853c-aad63a1cc1fe","Type":"ContainerDied","Data":"5a444c0baf333fd1de42a889543216d2e8dad0227a88e716e668ae8b90de7c9e"} Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.288141 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a444c0baf333fd1de42a889543216d2e8dad0227a88e716e668ae8b90de7c9e" Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.294217 4773 generic.go:334] "Generic (PLEG): container finished" podID="58070518-459d-4437-8aa7-7a532264b18d" containerID="a2887ec44407764bd1ea80b2186d9f3a49dbe7f2a2e8ca0bdd21756896b7b160" exitCode=0 Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.294276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" event={"ID":"58070518-459d-4437-8aa7-7a532264b18d","Type":"ContainerDied","Data":"a2887ec44407764bd1ea80b2186d9f3a49dbe7f2a2e8ca0bdd21756896b7b160"} Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.833089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bwts"] Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.925456 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl"] Jan 21 16:00:04 crc kubenswrapper[4773]: I0121 16:00:04.937601 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483475-7ftkl"] Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.309267 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" 
containerID="5e252f96e21b5dfa03769b02dbde88e73aa989a2ee8aa7ebb3af629ec805c19c" exitCode=0 Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.309445 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerDied","Data":"5e252f96e21b5dfa03769b02dbde88e73aa989a2ee8aa7ebb3af629ec805c19c"} Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.309514 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerStarted","Data":"27ebf2d1409fd419068a6722fd4c43454ba7c291d913f4c18d9392182de31380"} Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.400205 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3a2458-cc1f-489a-9ce5-57d651ea1754" path="/var/lib/kubelet/pods/8c3a2458-cc1f-489a-9ce5-57d651ea1754/volumes" Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.923628 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.991525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory\") pod \"58070518-459d-4437-8aa7-7a532264b18d\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.991576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam\") pod \"58070518-459d-4437-8aa7-7a532264b18d\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.991757 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hhw7\" (UniqueName: \"kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7\") pod \"58070518-459d-4437-8aa7-7a532264b18d\" (UID: \"58070518-459d-4437-8aa7-7a532264b18d\") " Jan 21 16:00:05 crc kubenswrapper[4773]: I0121 16:00:05.997105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7" (OuterVolumeSpecName: "kube-api-access-6hhw7") pod "58070518-459d-4437-8aa7-7a532264b18d" (UID: "58070518-459d-4437-8aa7-7a532264b18d"). InnerVolumeSpecName "kube-api-access-6hhw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.025150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory" (OuterVolumeSpecName: "inventory") pod "58070518-459d-4437-8aa7-7a532264b18d" (UID: "58070518-459d-4437-8aa7-7a532264b18d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.027123 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58070518-459d-4437-8aa7-7a532264b18d" (UID: "58070518-459d-4437-8aa7-7a532264b18d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.095597 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hhw7\" (UniqueName: \"kubernetes.io/projected/58070518-459d-4437-8aa7-7a532264b18d-kube-api-access-6hhw7\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.095635 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.095650 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58070518-459d-4437-8aa7-7a532264b18d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.321032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626" event={"ID":"58070518-459d-4437-8aa7-7a532264b18d","Type":"ContainerDied","Data":"a8837e5369f1f06637c6f4c6cfd24041d531f1087014d9a69d0606d092f8ac2a"} Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.321066 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4b626"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.321070 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8837e5369f1f06637c6f4c6cfd24041d531f1087014d9a69d0606d092f8ac2a"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.406456 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"]
Jan 21 16:00:06 crc kubenswrapper[4773]: E0121 16:00:06.407132 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58070518-459d-4437-8aa7-7a532264b18d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.407161 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="58070518-459d-4437-8aa7-7a532264b18d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.407461 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="58070518-459d-4437-8aa7-7a532264b18d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.408373 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.410572 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.410945 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.411369 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.411485 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.450245 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"]
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.507498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.507631 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzxh\" (UniqueName: \"kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.507784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.611273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.611396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzxh\" (UniqueName: \"kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.611512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.617941 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.618406 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.629154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzxh\" (UniqueName: \"kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-msv6h\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:06 crc kubenswrapper[4773]: I0121 16:00:06.756605 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:07 crc kubenswrapper[4773]: I0121 16:00:07.348339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerStarted","Data":"1d18890302d3930cdf3ba42c9ac1de7e39d4dba939605133c5f47f3495fe70a1"}
Jan 21 16:00:07 crc kubenswrapper[4773]: I0121 16:00:07.359186 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"]
Jan 21 16:00:08 crc kubenswrapper[4773]: I0121 16:00:08.359631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h" event={"ID":"6bb2f7be-feb4-4081-a299-204701555c02","Type":"ContainerStarted","Data":"bfd7abcc5233b541d49f2172b9a9aea91d51f84507af6575671c098270617f70"}
Jan 21 16:00:09 crc kubenswrapper[4773]: I0121 16:00:09.375674 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerID="1d18890302d3930cdf3ba42c9ac1de7e39d4dba939605133c5f47f3495fe70a1" exitCode=0
Jan 21 16:00:09 crc kubenswrapper[4773]: I0121 16:00:09.375772 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerDied","Data":"1d18890302d3930cdf3ba42c9ac1de7e39d4dba939605133c5f47f3495fe70a1"}
Jan 21 16:00:10 crc kubenswrapper[4773]: I0121 16:00:10.387733 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h" event={"ID":"6bb2f7be-feb4-4081-a299-204701555c02","Type":"ContainerStarted","Data":"e153ad47e847460739fecc3dbdaafa1756760d7db1cfde82774a27ecb24b1b3a"}
Jan 21 16:00:10 crc kubenswrapper[4773]: I0121 16:00:10.411441 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h" podStartSLOduration=2.548098948 podStartE2EDuration="4.411420368s" podCreationTimestamp="2026-01-21 16:00:06 +0000 UTC" firstStartedPulling="2026-01-21 16:00:07.338918921 +0000 UTC m=+2172.263408543" lastFinishedPulling="2026-01-21 16:00:09.202240341 +0000 UTC m=+2174.126729963" observedRunningTime="2026-01-21 16:00:10.407283264 +0000 UTC m=+2175.331772886" watchObservedRunningTime="2026-01-21 16:00:10.411420368 +0000 UTC m=+2175.335909980"
Jan 21 16:00:11 crc kubenswrapper[4773]: I0121 16:00:11.400279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerStarted","Data":"80d7cdd1081dbe0174f909725e22f39d346b2206458ab19b1cad27fb5783fb75"}
Jan 21 16:00:11 crc kubenswrapper[4773]: I0121 16:00:11.425836 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bwts" podStartSLOduration=3.425910146 podStartE2EDuration="8.425816303s" podCreationTimestamp="2026-01-21 16:00:03 +0000 UTC" firstStartedPulling="2026-01-21 16:00:05.311328354 +0000 UTC m=+2170.235817976" lastFinishedPulling="2026-01-21 16:00:10.311234511 +0000 UTC m=+2175.235724133" observedRunningTime="2026-01-21 16:00:11.420298892 +0000 UTC m=+2176.344788514" watchObservedRunningTime="2026-01-21 16:00:11.425816303 +0000 UTC m=+2176.350305925"
Jan 21 16:00:13 crc kubenswrapper[4773]: I0121 16:00:13.038815 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-djq9w"]
Jan 21 16:00:13 crc kubenswrapper[4773]: I0121 16:00:13.047284 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-djq9w"]
Jan 21 16:00:13 crc kubenswrapper[4773]: I0121 16:00:13.395890 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815b33e5-5403-49d5-941d-8dc85c57a336" path="/var/lib/kubelet/pods/815b33e5-5403-49d5-941d-8dc85c57a336/volumes"
Jan 21 16:00:14 crc kubenswrapper[4773]: I0121 16:00:14.226871 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:14 crc kubenswrapper[4773]: I0121 16:00:14.226944 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:15 crc kubenswrapper[4773]: I0121 16:00:15.284296 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7bwts" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="registry-server" probeResult="failure" output=<
Jan 21 16:00:15 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 16:00:15 crc kubenswrapper[4773]: >
Jan 21 16:00:21 crc kubenswrapper[4773]: I0121 16:00:21.455971 4773 scope.go:117] "RemoveContainer" containerID="b48d8918e0041b67b95045ce1886f07f41b253b7b11712c8b5045b0015b572d0"
Jan 21 16:00:21 crc kubenswrapper[4773]: I0121 16:00:21.505751 4773 scope.go:117] "RemoveContainer" containerID="078c10f544cb57265747dcccf59de4ec4af518ed1875b8edf3aa3cce8a6c25fb"
Jan 21 16:00:21 crc kubenswrapper[4773]: I0121 16:00:21.562827 4773 scope.go:117] "RemoveContainer" containerID="b9845560e1a2299efc2f6877bf13cb25b6c6677446a3213edef4dfad7a755dc9"
Jan 21 16:00:21 crc kubenswrapper[4773]: I0121 16:00:21.601084 4773 scope.go:117] "RemoveContainer" containerID="f4607388de82d4843137cf8a6d25aa39d8e8cc3f85e0dfd6d7c406f904ac2d29"
Jan 21 16:00:24 crc kubenswrapper[4773]: I0121 16:00:24.332921 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:24 crc kubenswrapper[4773]: I0121 16:00:24.390119 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:24 crc kubenswrapper[4773]: I0121 16:00:24.579067 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bwts"]
Jan 21 16:00:25 crc kubenswrapper[4773]: I0121 16:00:25.534119 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7bwts" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="registry-server" containerID="cri-o://80d7cdd1081dbe0174f909725e22f39d346b2206458ab19b1cad27fb5783fb75" gracePeriod=2
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.552540 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerID="80d7cdd1081dbe0174f909725e22f39d346b2206458ab19b1cad27fb5783fb75" exitCode=0
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.552631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerDied","Data":"80d7cdd1081dbe0174f909725e22f39d346b2206458ab19b1cad27fb5783fb75"}
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.781865 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.884520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content\") pod \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") "
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.884654 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76vd8\" (UniqueName: \"kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8\") pod \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") "
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.884782 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities\") pod \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\" (UID: \"a8683d43-445f-40da-a2fc-6a81e9fc2e1b\") "
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.885666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities" (OuterVolumeSpecName: "utilities") pod "a8683d43-445f-40da-a2fc-6a81e9fc2e1b" (UID: "a8683d43-445f-40da-a2fc-6a81e9fc2e1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.910670 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8" (OuterVolumeSpecName: "kube-api-access-76vd8") pod "a8683d43-445f-40da-a2fc-6a81e9fc2e1b" (UID: "a8683d43-445f-40da-a2fc-6a81e9fc2e1b"). InnerVolumeSpecName "kube-api-access-76vd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.942251 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8683d43-445f-40da-a2fc-6a81e9fc2e1b" (UID: "a8683d43-445f-40da-a2fc-6a81e9fc2e1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.987612 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.987651 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76vd8\" (UniqueName: \"kubernetes.io/projected/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-kube-api-access-76vd8\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:26 crc kubenswrapper[4773]: I0121 16:00:26.987663 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8683d43-445f-40da-a2fc-6a81e9fc2e1b-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.640122 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bwts" event={"ID":"a8683d43-445f-40da-a2fc-6a81e9fc2e1b","Type":"ContainerDied","Data":"27ebf2d1409fd419068a6722fd4c43454ba7c291d913f4c18d9392182de31380"}
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.640208 4773 scope.go:117] "RemoveContainer" containerID="80d7cdd1081dbe0174f909725e22f39d346b2206458ab19b1cad27fb5783fb75"
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.640500 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bwts"
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.686858 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bwts"]
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.691922 4773 scope.go:117] "RemoveContainer" containerID="1d18890302d3930cdf3ba42c9ac1de7e39d4dba939605133c5f47f3495fe70a1"
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.696231 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7bwts"]
Jan 21 16:00:27 crc kubenswrapper[4773]: I0121 16:00:27.718331 4773 scope.go:117] "RemoveContainer" containerID="5e252f96e21b5dfa03769b02dbde88e73aa989a2ee8aa7ebb3af629ec805c19c"
Jan 21 16:00:29 crc kubenswrapper[4773]: I0121 16:00:29.399595 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" path="/var/lib/kubelet/pods/a8683d43-445f-40da-a2fc-6a81e9fc2e1b/volumes"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.494199 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"]
Jan 21 16:00:37 crc kubenswrapper[4773]: E0121 16:00:37.495310 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="extract-utilities"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.495329 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="extract-utilities"
Jan 21 16:00:37 crc kubenswrapper[4773]: E0121 16:00:37.495351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="registry-server"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.495360 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="registry-server"
Jan 21 16:00:37 crc kubenswrapper[4773]: E0121 16:00:37.495384 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="extract-content"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.495395 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="extract-content"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.495627 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8683d43-445f-40da-a2fc-6a81e9fc2e1b" containerName="registry-server"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.497550 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.505673 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"]
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.606822 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw27z\" (UniqueName: \"kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.606895 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.607021 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.708810 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.708976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw27z\" (UniqueName: \"kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.709018 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.709434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.709579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.733326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw27z\" (UniqueName: \"kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z\") pod \"redhat-marketplace-zn878\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:37 crc kubenswrapper[4773]: I0121 16:00:37.869149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn878"
Jan 21 16:00:38 crc kubenswrapper[4773]: I0121 16:00:38.417694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"]
Jan 21 16:00:38 crc kubenswrapper[4773]: I0121 16:00:38.875569 4773 generic.go:334] "Generic (PLEG): container finished" podID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerID="3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8" exitCode=0
Jan 21 16:00:38 crc kubenswrapper[4773]: I0121 16:00:38.876735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerDied","Data":"3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8"}
Jan 21 16:00:38 crc kubenswrapper[4773]: I0121 16:00:38.876793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerStarted","Data":"ebb8dd064f8b1d2e7b347333484b83eefedad5949d5557ac6312e31bb0e5982f"}
Jan 21 16:00:42 crc kubenswrapper[4773]: I0121 16:00:42.923433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerStarted","Data":"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866"}
Jan 21 16:00:44 crc kubenswrapper[4773]: I0121 16:00:44.945745 4773 generic.go:334] "Generic (PLEG): container finished" podID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerID="99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866" exitCode=0
Jan 21 16:00:44 crc kubenswrapper[4773]: I0121 16:00:44.945806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerDied","Data":"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866"}
Jan 21 16:00:49 crc kubenswrapper[4773]: I0121 16:00:49.992960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerStarted","Data":"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4"}
Jan 21 16:00:49 crc kubenswrapper[4773]: I0121 16:00:49.994861 4773 generic.go:334] "Generic (PLEG): container finished" podID="6bb2f7be-feb4-4081-a299-204701555c02" containerID="e153ad47e847460739fecc3dbdaafa1756760d7db1cfde82774a27ecb24b1b3a" exitCode=0
Jan 21 16:00:49 crc kubenswrapper[4773]: I0121 16:00:49.994898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h" event={"ID":"6bb2f7be-feb4-4081-a299-204701555c02","Type":"ContainerDied","Data":"e153ad47e847460739fecc3dbdaafa1756760d7db1cfde82774a27ecb24b1b3a"}
Jan 21 16:00:50 crc kubenswrapper[4773]: I0121 16:00:50.018817 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zn878" podStartSLOduration=3.6765929120000003 podStartE2EDuration="13.018790718s" podCreationTimestamp="2026-01-21 16:00:37 +0000 UTC" firstStartedPulling="2026-01-21 16:00:38.884363645 +0000 UTC m=+2203.808853267" lastFinishedPulling="2026-01-21 16:00:48.226561451 +0000 UTC m=+2213.151051073" observedRunningTime="2026-01-21 16:00:50.017472352 +0000 UTC m=+2214.941961974" watchObservedRunningTime="2026-01-21 16:00:50.018790718 +0000 UTC m=+2214.943280340"
Jan 21 16:00:50 crc kubenswrapper[4773]: I0121 16:00:50.083133 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-9s9qp"]
Jan 21 16:00:50 crc kubenswrapper[4773]: I0121 16:00:50.091337 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-9s9qp"]
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.398537 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d70a183-ee4f-41e0-9b98-1f921165cecb" path="/var/lib/kubelet/pods/0d70a183-ee4f-41e0-9b98-1f921165cecb/volumes"
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.550129 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.641179 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam\") pod \"6bb2f7be-feb4-4081-a299-204701555c02\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") "
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.641273 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory\") pod \"6bb2f7be-feb4-4081-a299-204701555c02\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") "
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.641434 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpzxh\" (UniqueName: \"kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh\") pod \"6bb2f7be-feb4-4081-a299-204701555c02\" (UID: \"6bb2f7be-feb4-4081-a299-204701555c02\") "
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.663813 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh" (OuterVolumeSpecName: "kube-api-access-gpzxh") pod "6bb2f7be-feb4-4081-a299-204701555c02" (UID: "6bb2f7be-feb4-4081-a299-204701555c02"). InnerVolumeSpecName "kube-api-access-gpzxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.676996 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory" (OuterVolumeSpecName: "inventory") pod "6bb2f7be-feb4-4081-a299-204701555c02" (UID: "6bb2f7be-feb4-4081-a299-204701555c02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.699903 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6bb2f7be-feb4-4081-a299-204701555c02" (UID: "6bb2f7be-feb4-4081-a299-204701555c02"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.745648 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpzxh\" (UniqueName: \"kubernetes.io/projected/6bb2f7be-feb4-4081-a299-204701555c02-kube-api-access-gpzxh\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.745742 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:51 crc kubenswrapper[4773]: I0121 16:00:51.745821 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb2f7be-feb4-4081-a299-204701555c02-inventory\") on node \"crc\" DevicePath \"\""
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.016333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h" event={"ID":"6bb2f7be-feb4-4081-a299-204701555c02","Type":"ContainerDied","Data":"bfd7abcc5233b541d49f2172b9a9aea91d51f84507af6575671c098270617f70"}
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.016372 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfd7abcc5233b541d49f2172b9a9aea91d51f84507af6575671c098270617f70"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.016430 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-msv6h"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.132633 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"]
Jan 21 16:00:52 crc kubenswrapper[4773]: E0121 16:00:52.133090 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb2f7be-feb4-4081-a299-204701555c02" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.133117 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb2f7be-feb4-4081-a299-204701555c02" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.133377 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb2f7be-feb4-4081-a299-204701555c02" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.134307 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.136707 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.137116 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.137181 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.143192 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.156427 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"]
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.256737 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.256808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.256895 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdfn\" (UniqueName: \"kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.359508 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.359620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdfn\" (UniqueName: \"kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.359788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.363765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.363784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.376353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdfn\" (UniqueName: \"kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"
Jan 21 16:00:52 crc kubenswrapper[4773]: I0121 16:00:52.452561 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" Jan 21 16:00:53 crc kubenswrapper[4773]: I0121 16:00:53.011339 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv"] Jan 21 16:00:53 crc kubenswrapper[4773]: I0121 16:00:53.032405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" event={"ID":"7031589c-e137-46d5-afdf-77044617bfa2","Type":"ContainerStarted","Data":"c0acebb147cc5ade4af918c52060e4ae672c125125f53e2fdf72c7d3836cdf13"} Jan 21 16:00:54 crc kubenswrapper[4773]: I0121 16:00:54.042036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" event={"ID":"7031589c-e137-46d5-afdf-77044617bfa2","Type":"ContainerStarted","Data":"a5892b3262e665bf2538e8cb8080d373fe18c80ad80762daa69cd54005f570d0"} Jan 21 16:00:54 crc kubenswrapper[4773]: I0121 16:00:54.063253 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" podStartSLOduration=1.520390619 podStartE2EDuration="2.06323188s" podCreationTimestamp="2026-01-21 16:00:52 +0000 UTC" firstStartedPulling="2026-01-21 16:00:53.01460038 +0000 UTC m=+2217.939090002" lastFinishedPulling="2026-01-21 16:00:53.557441641 +0000 UTC m=+2218.481931263" observedRunningTime="2026-01-21 16:00:54.056140467 +0000 UTC m=+2218.980630089" watchObservedRunningTime="2026-01-21 16:00:54.06323188 +0000 UTC m=+2218.987721502" Jan 21 16:00:57 crc kubenswrapper[4773]: I0121 16:00:57.869556 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 21 16:00:57 crc kubenswrapper[4773]: I0121 16:00:57.870172 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 
21 16:00:57 crc kubenswrapper[4773]: I0121 16:00:57.942956 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 21 16:00:58 crc kubenswrapper[4773]: I0121 16:00:58.145596 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 21 16:00:58 crc kubenswrapper[4773]: I0121 16:00:58.196416 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"] Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.114805 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zn878" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="registry-server" containerID="cri-o://1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4" gracePeriod=2 Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.193887 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29483521-49spb"] Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.195281 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.203655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483521-49spb"] Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.289437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.289484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6z9c\" (UniqueName: \"kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.289660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.289684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.391608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.392363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6z9c\" (UniqueName: \"kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.392614 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.392638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.399903 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.401416 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.407134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.410254 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6z9c\" (UniqueName: \"kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c\") pod \"keystone-cron-29483521-49spb\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.592893 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.700849 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.802619 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw27z\" (UniqueName: \"kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z\") pod \"8393ffe3-273e-4a3a-8f5c-434690d84cca\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.803578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content\") pod \"8393ffe3-273e-4a3a-8f5c-434690d84cca\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.803633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities\") pod \"8393ffe3-273e-4a3a-8f5c-434690d84cca\" (UID: \"8393ffe3-273e-4a3a-8f5c-434690d84cca\") " Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.805198 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities" (OuterVolumeSpecName: "utilities") pod "8393ffe3-273e-4a3a-8f5c-434690d84cca" (UID: "8393ffe3-273e-4a3a-8f5c-434690d84cca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.807617 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z" (OuterVolumeSpecName: "kube-api-access-zw27z") pod "8393ffe3-273e-4a3a-8f5c-434690d84cca" (UID: "8393ffe3-273e-4a3a-8f5c-434690d84cca"). InnerVolumeSpecName "kube-api-access-zw27z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.827194 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8393ffe3-273e-4a3a-8f5c-434690d84cca" (UID: "8393ffe3-273e-4a3a-8f5c-434690d84cca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.907371 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw27z\" (UniqueName: \"kubernetes.io/projected/8393ffe3-273e-4a3a-8f5c-434690d84cca-kube-api-access-zw27z\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.907438 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:00 crc kubenswrapper[4773]: I0121 16:01:00.907449 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8393ffe3-273e-4a3a-8f5c-434690d84cca-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:01 crc kubenswrapper[4773]: W0121 16:01:01.084546 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ba1912_c7fc_40d8_b046_98d8d6e7931b.slice/crio-50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3 WatchSource:0}: Error finding container 50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3: Status 404 returned error can't find the container with id 50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3 Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.085479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-cron-29483521-49spb"] Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.128888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483521-49spb" event={"ID":"33ba1912-c7fc-40d8-b046-98d8d6e7931b","Type":"ContainerStarted","Data":"50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3"} Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.132533 4773 generic.go:334] "Generic (PLEG): container finished" podID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerID="1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4" exitCode=0 Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.132636 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn878" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.132620 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerDied","Data":"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4"} Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.132918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn878" event={"ID":"8393ffe3-273e-4a3a-8f5c-434690d84cca","Type":"ContainerDied","Data":"ebb8dd064f8b1d2e7b347333484b83eefedad5949d5557ac6312e31bb0e5982f"} Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.132963 4773 scope.go:117] "RemoveContainer" containerID="1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.166681 4773 scope.go:117] "RemoveContainer" containerID="99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.195473 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"] Jan 21 
16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.200731 4773 scope.go:117] "RemoveContainer" containerID="3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.213111 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn878"] Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.225571 4773 scope.go:117] "RemoveContainer" containerID="1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4" Jan 21 16:01:01 crc kubenswrapper[4773]: E0121 16:01:01.226011 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4\": container with ID starting with 1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4 not found: ID does not exist" containerID="1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.226044 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4"} err="failed to get container status \"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4\": rpc error: code = NotFound desc = could not find container \"1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4\": container with ID starting with 1f0b51c544a709a80d9b3a4c7ef50c86c503cef7ca14cecf58af832ff5a75cd4 not found: ID does not exist" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.226077 4773 scope.go:117] "RemoveContainer" containerID="99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866" Jan 21 16:01:01 crc kubenswrapper[4773]: E0121 16:01:01.226369 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866\": container with ID starting with 99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866 not found: ID does not exist" containerID="99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.226400 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866"} err="failed to get container status \"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866\": rpc error: code = NotFound desc = could not find container \"99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866\": container with ID starting with 99a9a6c9b0a0160e48fe0eb511214051f20951c6d6630d3bb6699ded44d54866 not found: ID does not exist" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.226416 4773 scope.go:117] "RemoveContainer" containerID="3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8" Jan 21 16:01:01 crc kubenswrapper[4773]: E0121 16:01:01.226727 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8\": container with ID starting with 3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8 not found: ID does not exist" containerID="3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.226761 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8"} err="failed to get container status \"3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8\": rpc error: code = NotFound desc = could not find container \"3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8\": container with ID 
starting with 3a91558b312de73768ef22e56f7d18c264b2fc06bd0847f7989c6d98ddb477c8 not found: ID does not exist" Jan 21 16:01:01 crc kubenswrapper[4773]: I0121 16:01:01.397327 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" path="/var/lib/kubelet/pods/8393ffe3-273e-4a3a-8f5c-434690d84cca/volumes" Jan 21 16:01:02 crc kubenswrapper[4773]: I0121 16:01:02.146564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483521-49spb" event={"ID":"33ba1912-c7fc-40d8-b046-98d8d6e7931b","Type":"ContainerStarted","Data":"b4ef86cc4a455e1fd87a789b3cbe3a5f5419ae2558a5bdcf7e9809a42145420f"} Jan 21 16:01:02 crc kubenswrapper[4773]: I0121 16:01:02.169741 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29483521-49spb" podStartSLOduration=2.169724495 podStartE2EDuration="2.169724495s" podCreationTimestamp="2026-01-21 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:01:02.163871066 +0000 UTC m=+2227.088360678" watchObservedRunningTime="2026-01-21 16:01:02.169724495 +0000 UTC m=+2227.094214117" Jan 21 16:01:05 crc kubenswrapper[4773]: I0121 16:01:05.184180 4773 generic.go:334] "Generic (PLEG): container finished" podID="33ba1912-c7fc-40d8-b046-98d8d6e7931b" containerID="b4ef86cc4a455e1fd87a789b3cbe3a5f5419ae2558a5bdcf7e9809a42145420f" exitCode=0 Jan 21 16:01:05 crc kubenswrapper[4773]: I0121 16:01:05.184229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483521-49spb" event={"ID":"33ba1912-c7fc-40d8-b046-98d8d6e7931b","Type":"ContainerDied","Data":"b4ef86cc4a455e1fd87a789b3cbe3a5f5419ae2558a5bdcf7e9809a42145420f"} Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.642329 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.765251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle\") pod \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.765338 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6z9c\" (UniqueName: \"kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c\") pod \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.765374 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data\") pod \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.765451 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys\") pod \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\" (UID: \"33ba1912-c7fc-40d8-b046-98d8d6e7931b\") " Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.777842 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "33ba1912-c7fc-40d8-b046-98d8d6e7931b" (UID: "33ba1912-c7fc-40d8-b046-98d8d6e7931b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.777939 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c" (OuterVolumeSpecName: "kube-api-access-v6z9c") pod "33ba1912-c7fc-40d8-b046-98d8d6e7931b" (UID: "33ba1912-c7fc-40d8-b046-98d8d6e7931b"). InnerVolumeSpecName "kube-api-access-v6z9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.802684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ba1912-c7fc-40d8-b046-98d8d6e7931b" (UID: "33ba1912-c7fc-40d8-b046-98d8d6e7931b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.834956 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data" (OuterVolumeSpecName: "config-data") pod "33ba1912-c7fc-40d8-b046-98d8d6e7931b" (UID: "33ba1912-c7fc-40d8-b046-98d8d6e7931b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.868769 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.868805 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.868817 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6z9c\" (UniqueName: \"kubernetes.io/projected/33ba1912-c7fc-40d8-b046-98d8d6e7931b-kube-api-access-v6z9c\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:06 crc kubenswrapper[4773]: I0121 16:01:06.868827 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ba1912-c7fc-40d8-b046-98d8d6e7931b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:07 crc kubenswrapper[4773]: I0121 16:01:07.205003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483521-49spb" event={"ID":"33ba1912-c7fc-40d8-b046-98d8d6e7931b","Type":"ContainerDied","Data":"50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3"} Jan 21 16:01:07 crc kubenswrapper[4773]: I0121 16:01:07.205049 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f6437ad2d1d989f9e1ae10ae542eeb7a117a75d186eba3c018eef86e338df3" Jan 21 16:01:07 crc kubenswrapper[4773]: I0121 16:01:07.205108 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483521-49spb" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.077324 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:12 crc kubenswrapper[4773]: E0121 16:01:12.078465 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="extract-utilities" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078484 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="extract-utilities" Jan 21 16:01:12 crc kubenswrapper[4773]: E0121 16:01:12.078517 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="registry-server" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078526 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="registry-server" Jan 21 16:01:12 crc kubenswrapper[4773]: E0121 16:01:12.078543 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ba1912-c7fc-40d8-b046-98d8d6e7931b" containerName="keystone-cron" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078554 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ba1912-c7fc-40d8-b046-98d8d6e7931b" containerName="keystone-cron" Jan 21 16:01:12 crc kubenswrapper[4773]: E0121 16:01:12.078572 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="extract-content" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078579 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" containerName="extract-content" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078827 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8393ffe3-273e-4a3a-8f5c-434690d84cca" 
containerName="registry-server" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.078849 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ba1912-c7fc-40d8-b046-98d8d6e7931b" containerName="keystone-cron" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.080810 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.109372 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.168410 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqs8\" (UniqueName: \"kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.168473 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.168553 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.270100 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zjqs8\" (UniqueName: \"kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.270150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.270224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.270902 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.271024 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.292773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqs8\" (UniqueName: 
\"kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8\") pod \"community-operators-n67dm\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.406785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:12 crc kubenswrapper[4773]: I0121 16:01:12.988880 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:13 crc kubenswrapper[4773]: I0121 16:01:13.262789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerStarted","Data":"a96830c86a1cc8e2693c99b4233ca1254d595026642b844bf11ce89e0c0cf4de"} Jan 21 16:01:13 crc kubenswrapper[4773]: I0121 16:01:13.264865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerStarted","Data":"0f5f2c3eca3ec3476e7bd5045c5eb3b4691080c6b8b94e5640a5e8aa712b2f06"} Jan 21 16:01:14 crc kubenswrapper[4773]: I0121 16:01:14.272013 4773 generic.go:334] "Generic (PLEG): container finished" podID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerID="a96830c86a1cc8e2693c99b4233ca1254d595026642b844bf11ce89e0c0cf4de" exitCode=0 Jan 21 16:01:14 crc kubenswrapper[4773]: I0121 16:01:14.272089 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerDied","Data":"a96830c86a1cc8e2693c99b4233ca1254d595026642b844bf11ce89e0c0cf4de"} Jan 21 16:01:16 crc kubenswrapper[4773]: I0121 16:01:16.291939 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" 
event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerStarted","Data":"fcb56e1e577b9a29d4104b4cc811116ffb3f711c5be759c36423352dc752695f"} Jan 21 16:01:17 crc kubenswrapper[4773]: I0121 16:01:17.303216 4773 generic.go:334] "Generic (PLEG): container finished" podID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerID="fcb56e1e577b9a29d4104b4cc811116ffb3f711c5be759c36423352dc752695f" exitCode=0 Jan 21 16:01:17 crc kubenswrapper[4773]: I0121 16:01:17.303266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerDied","Data":"fcb56e1e577b9a29d4104b4cc811116ffb3f711c5be759c36423352dc752695f"} Jan 21 16:01:20 crc kubenswrapper[4773]: I0121 16:01:20.336212 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerStarted","Data":"831429d8ba5f58d2ac48cfb5ebf75d78820042b23709a9ce3c15cb66a84ef83b"} Jan 21 16:01:20 crc kubenswrapper[4773]: I0121 16:01:20.363652 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n67dm" podStartSLOduration=3.326277009 podStartE2EDuration="8.36363485s" podCreationTimestamp="2026-01-21 16:01:12 +0000 UTC" firstStartedPulling="2026-01-21 16:01:13.267381677 +0000 UTC m=+2238.191871299" lastFinishedPulling="2026-01-21 16:01:18.304739508 +0000 UTC m=+2243.229229140" observedRunningTime="2026-01-21 16:01:20.354500461 +0000 UTC m=+2245.278990083" watchObservedRunningTime="2026-01-21 16:01:20.36363485 +0000 UTC m=+2245.288124472" Jan 21 16:01:21 crc kubenswrapper[4773]: I0121 16:01:21.828311 4773 scope.go:117] "RemoveContainer" containerID="997a78c8b2f79aa75b335c354cd6fcb926298a3d056aeba12a5a77a363706460" Jan 21 16:01:22 crc kubenswrapper[4773]: I0121 16:01:22.407481 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:22 crc kubenswrapper[4773]: I0121 16:01:22.407517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:22 crc kubenswrapper[4773]: I0121 16:01:22.460416 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:25 crc kubenswrapper[4773]: I0121 16:01:25.205576 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:01:25 crc kubenswrapper[4773]: I0121 16:01:25.206340 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:01:32 crc kubenswrapper[4773]: I0121 16:01:32.456402 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:32 crc kubenswrapper[4773]: I0121 16:01:32.510054 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:33 crc kubenswrapper[4773]: I0121 16:01:33.463629 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n67dm" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="registry-server" containerID="cri-o://831429d8ba5f58d2ac48cfb5ebf75d78820042b23709a9ce3c15cb66a84ef83b" gracePeriod=2 Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.478136 4773 
generic.go:334] "Generic (PLEG): container finished" podID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerID="831429d8ba5f58d2ac48cfb5ebf75d78820042b23709a9ce3c15cb66a84ef83b" exitCode=0 Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.478266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerDied","Data":"831429d8ba5f58d2ac48cfb5ebf75d78820042b23709a9ce3c15cb66a84ef83b"} Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.478786 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n67dm" event={"ID":"20c7af43-5df2-45ce-8702-a1b69c88b5a2","Type":"ContainerDied","Data":"0f5f2c3eca3ec3476e7bd5045c5eb3b4691080c6b8b94e5640a5e8aa712b2f06"} Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.478808 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f5f2c3eca3ec3476e7bd5045c5eb3b4691080c6b8b94e5640a5e8aa712b2f06" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.513955 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.666974 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjqs8\" (UniqueName: \"kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8\") pod \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.667095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities\") pod \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.667248 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content\") pod \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\" (UID: \"20c7af43-5df2-45ce-8702-a1b69c88b5a2\") " Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.671720 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities" (OuterVolumeSpecName: "utilities") pod "20c7af43-5df2-45ce-8702-a1b69c88b5a2" (UID: "20c7af43-5df2-45ce-8702-a1b69c88b5a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.680974 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8" (OuterVolumeSpecName: "kube-api-access-zjqs8") pod "20c7af43-5df2-45ce-8702-a1b69c88b5a2" (UID: "20c7af43-5df2-45ce-8702-a1b69c88b5a2"). InnerVolumeSpecName "kube-api-access-zjqs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.721741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20c7af43-5df2-45ce-8702-a1b69c88b5a2" (UID: "20c7af43-5df2-45ce-8702-a1b69c88b5a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.770246 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjqs8\" (UniqueName: \"kubernetes.io/projected/20c7af43-5df2-45ce-8702-a1b69c88b5a2-kube-api-access-zjqs8\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.770288 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:34 crc kubenswrapper[4773]: I0121 16:01:34.770299 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c7af43-5df2-45ce-8702-a1b69c88b5a2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:35 crc kubenswrapper[4773]: I0121 16:01:35.487118 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n67dm" Jan 21 16:01:35 crc kubenswrapper[4773]: I0121 16:01:35.517905 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:35 crc kubenswrapper[4773]: I0121 16:01:35.526778 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n67dm"] Jan 21 16:01:37 crc kubenswrapper[4773]: I0121 16:01:37.397612 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" path="/var/lib/kubelet/pods/20c7af43-5df2-45ce-8702-a1b69c88b5a2/volumes" Jan 21 16:01:41 crc kubenswrapper[4773]: I0121 16:01:41.060658 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-j7nrq"] Jan 21 16:01:41 crc kubenswrapper[4773]: I0121 16:01:41.070873 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-j7nrq"] Jan 21 16:01:41 crc kubenswrapper[4773]: I0121 16:01:41.398157 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6481770-7fe3-45bf-8e7b-18ca325f1a6d" path="/var/lib/kubelet/pods/e6481770-7fe3-45bf-8e7b-18ca325f1a6d/volumes" Jan 21 16:01:45 crc kubenswrapper[4773]: I0121 16:01:45.586887 4773 generic.go:334] "Generic (PLEG): container finished" podID="7031589c-e137-46d5-afdf-77044617bfa2" containerID="a5892b3262e665bf2538e8cb8080d373fe18c80ad80762daa69cd54005f570d0" exitCode=0 Jan 21 16:01:45 crc kubenswrapper[4773]: I0121 16:01:45.586982 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" event={"ID":"7031589c-e137-46d5-afdf-77044617bfa2","Type":"ContainerDied","Data":"a5892b3262e665bf2538e8cb8080d373fe18c80ad80762daa69cd54005f570d0"} Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.127349 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.231662 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam\") pod \"7031589c-e137-46d5-afdf-77044617bfa2\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.231891 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmdfn\" (UniqueName: \"kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn\") pod \"7031589c-e137-46d5-afdf-77044617bfa2\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.232026 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory\") pod \"7031589c-e137-46d5-afdf-77044617bfa2\" (UID: \"7031589c-e137-46d5-afdf-77044617bfa2\") " Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.237169 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn" (OuterVolumeSpecName: "kube-api-access-hmdfn") pod "7031589c-e137-46d5-afdf-77044617bfa2" (UID: "7031589c-e137-46d5-afdf-77044617bfa2"). InnerVolumeSpecName "kube-api-access-hmdfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.260194 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7031589c-e137-46d5-afdf-77044617bfa2" (UID: "7031589c-e137-46d5-afdf-77044617bfa2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.265412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory" (OuterVolumeSpecName: "inventory") pod "7031589c-e137-46d5-afdf-77044617bfa2" (UID: "7031589c-e137-46d5-afdf-77044617bfa2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.335313 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.335354 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmdfn\" (UniqueName: \"kubernetes.io/projected/7031589c-e137-46d5-afdf-77044617bfa2-kube-api-access-hmdfn\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.335366 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7031589c-e137-46d5-afdf-77044617bfa2-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.612041 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" 
event={"ID":"7031589c-e137-46d5-afdf-77044617bfa2","Type":"ContainerDied","Data":"c0acebb147cc5ade4af918c52060e4ae672c125125f53e2fdf72c7d3836cdf13"} Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.612386 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0acebb147cc5ade4af918c52060e4ae672c125125f53e2fdf72c7d3836cdf13" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.612481 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.722280 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9xcxn"] Jan 21 16:01:47 crc kubenswrapper[4773]: E0121 16:01:47.722821 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="registry-server" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.722841 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="registry-server" Jan 21 16:01:47 crc kubenswrapper[4773]: E0121 16:01:47.722853 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="extract-content" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.722868 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="extract-content" Jan 21 16:01:47 crc kubenswrapper[4773]: E0121 16:01:47.722884 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="extract-utilities" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.722891 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="extract-utilities" Jan 21 16:01:47 crc kubenswrapper[4773]: E0121 
16:01:47.722901 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7031589c-e137-46d5-afdf-77044617bfa2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.722908 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7031589c-e137-46d5-afdf-77044617bfa2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.723116 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7031589c-e137-46d5-afdf-77044617bfa2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.723130 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c7af43-5df2-45ce-8702-a1b69c88b5a2" containerName="registry-server" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.724057 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.729107 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.729262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.729461 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.729639 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.736009 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9xcxn"] Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 
16:01:47.854840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.855446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.855560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bgl\" (UniqueName: \"kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.957612 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.957691 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bgl\" (UniqueName: \"kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl\") pod 
\"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.957842 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.962451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.962643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:47 crc kubenswrapper[4773]: I0121 16:01:47.989927 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bgl\" (UniqueName: \"kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl\") pod \"ssh-known-hosts-edpm-deployment-9xcxn\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:48 crc kubenswrapper[4773]: I0121 16:01:48.065772 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:48 crc kubenswrapper[4773]: I0121 16:01:48.633502 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9xcxn"] Jan 21 16:01:48 crc kubenswrapper[4773]: I0121 16:01:48.639665 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.044606 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-5vvkf"] Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.054865 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-5vvkf"] Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.402370 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15296d45-0901-4f95-b397-841dc24e8a08" path="/var/lib/kubelet/pods/15296d45-0901-4f95-b397-841dc24e8a08/volumes" Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.634362 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" event={"ID":"acf5cb49-230b-4c79-b383-3ea958daeede","Type":"ContainerStarted","Data":"3ea37e30b10615877ccf0b6159ffa62aabebe7f17b0863987de667f3cbc3c0d9"} Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.634671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" event={"ID":"acf5cb49-230b-4c79-b383-3ea958daeede","Type":"ContainerStarted","Data":"654cb7860f8509950c7d4513f314626e962b5717daa7f17a13fa412f354f830a"} Jan 21 16:01:49 crc kubenswrapper[4773]: I0121 16:01:49.669555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" podStartSLOduration=2.044073322 podStartE2EDuration="2.669531078s" podCreationTimestamp="2026-01-21 16:01:47 +0000 UTC" firstStartedPulling="2026-01-21 
16:01:48.6394672 +0000 UTC m=+2273.563956822" lastFinishedPulling="2026-01-21 16:01:49.264924956 +0000 UTC m=+2274.189414578" observedRunningTime="2026-01-21 16:01:49.657389656 +0000 UTC m=+2274.581879278" watchObservedRunningTime="2026-01-21 16:01:49.669531078 +0000 UTC m=+2274.594020700" Jan 21 16:01:55 crc kubenswrapper[4773]: I0121 16:01:55.205487 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:01:55 crc kubenswrapper[4773]: I0121 16:01:55.206254 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:01:57 crc kubenswrapper[4773]: I0121 16:01:57.711618 4773 generic.go:334] "Generic (PLEG): container finished" podID="acf5cb49-230b-4c79-b383-3ea958daeede" containerID="3ea37e30b10615877ccf0b6159ffa62aabebe7f17b0863987de667f3cbc3c0d9" exitCode=0 Jan 21 16:01:57 crc kubenswrapper[4773]: I0121 16:01:57.711710 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" event={"ID":"acf5cb49-230b-4c79-b383-3ea958daeede","Type":"ContainerDied","Data":"3ea37e30b10615877ccf0b6159ffa62aabebe7f17b0863987de667f3cbc3c0d9"} Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.267135 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.430931 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0\") pod \"acf5cb49-230b-4c79-b383-3ea958daeede\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.431065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam\") pod \"acf5cb49-230b-4c79-b383-3ea958daeede\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.431135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bgl\" (UniqueName: \"kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl\") pod \"acf5cb49-230b-4c79-b383-3ea958daeede\" (UID: \"acf5cb49-230b-4c79-b383-3ea958daeede\") " Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.437235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl" (OuterVolumeSpecName: "kube-api-access-z4bgl") pod "acf5cb49-230b-4c79-b383-3ea958daeede" (UID: "acf5cb49-230b-4c79-b383-3ea958daeede"). InnerVolumeSpecName "kube-api-access-z4bgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.461277 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acf5cb49-230b-4c79-b383-3ea958daeede" (UID: "acf5cb49-230b-4c79-b383-3ea958daeede"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.462555 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "acf5cb49-230b-4c79-b383-3ea958daeede" (UID: "acf5cb49-230b-4c79-b383-3ea958daeede"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.534354 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.534389 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bgl\" (UniqueName: \"kubernetes.io/projected/acf5cb49-230b-4c79-b383-3ea958daeede-kube-api-access-z4bgl\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.534403 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/acf5cb49-230b-4c79-b383-3ea958daeede-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.735370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" 
event={"ID":"acf5cb49-230b-4c79-b383-3ea958daeede","Type":"ContainerDied","Data":"654cb7860f8509950c7d4513f314626e962b5717daa7f17a13fa412f354f830a"} Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.735412 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654cb7860f8509950c7d4513f314626e962b5717daa7f17a13fa412f354f830a" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.735467 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9xcxn" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.846488 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd"] Jan 21 16:01:59 crc kubenswrapper[4773]: E0121 16:01:59.847042 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf5cb49-230b-4c79-b383-3ea958daeede" containerName="ssh-known-hosts-edpm-deployment" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.847068 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf5cb49-230b-4c79-b383-3ea958daeede" containerName="ssh-known-hosts-edpm-deployment" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.847418 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf5cb49-230b-4c79-b383-3ea958daeede" containerName="ssh-known-hosts-edpm-deployment" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.848356 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.864944 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.870038 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd"] Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.871078 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.871159 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.871081 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.945102 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.945229 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:01:59 crc kubenswrapper[4773]: I0121 16:01:59.945257 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjldl\" (UniqueName: \"kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.048564 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.049034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjldl\" (UniqueName: \"kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.049314 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.062851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: 
\"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.067344 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.095620 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjldl\" (UniqueName: \"kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zmlnd\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.176358 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:00 crc kubenswrapper[4773]: I0121 16:02:00.742384 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd"] Jan 21 16:02:01 crc kubenswrapper[4773]: I0121 16:02:01.759384 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" event={"ID":"545811bf-853d-41fb-847b-8a483a017894","Type":"ContainerStarted","Data":"3730d6b809460f4a2a374e877546be4da32fd6f2bf7ab42c7e0ac93082551641"} Jan 21 16:02:02 crc kubenswrapper[4773]: I0121 16:02:02.784259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" event={"ID":"545811bf-853d-41fb-847b-8a483a017894","Type":"ContainerStarted","Data":"ee6fd9a56ca152c0997a7c35cf5d9cf9e95d350bd82a4ef99e34727d2eb13e7d"} Jan 21 16:02:02 crc kubenswrapper[4773]: I0121 16:02:02.806084 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" podStartSLOduration=2.930078213 podStartE2EDuration="3.806059443s" podCreationTimestamp="2026-01-21 16:01:59 +0000 UTC" firstStartedPulling="2026-01-21 16:02:00.749975337 +0000 UTC m=+2285.674464959" lastFinishedPulling="2026-01-21 16:02:01.625956567 +0000 UTC m=+2286.550446189" observedRunningTime="2026-01-21 16:02:02.799384721 +0000 UTC m=+2287.723874343" watchObservedRunningTime="2026-01-21 16:02:02.806059443 +0000 UTC m=+2287.730549055" Jan 21 16:02:11 crc kubenswrapper[4773]: I0121 16:02:11.875547 4773 generic.go:334] "Generic (PLEG): container finished" podID="545811bf-853d-41fb-847b-8a483a017894" containerID="ee6fd9a56ca152c0997a7c35cf5d9cf9e95d350bd82a4ef99e34727d2eb13e7d" exitCode=0 Jan 21 16:02:11 crc kubenswrapper[4773]: I0121 16:02:11.875637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" event={"ID":"545811bf-853d-41fb-847b-8a483a017894","Type":"ContainerDied","Data":"ee6fd9a56ca152c0997a7c35cf5d9cf9e95d350bd82a4ef99e34727d2eb13e7d"} Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.369443 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.444025 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam\") pod \"545811bf-853d-41fb-847b-8a483a017894\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.444190 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjldl\" (UniqueName: \"kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl\") pod \"545811bf-853d-41fb-847b-8a483a017894\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.444234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory\") pod \"545811bf-853d-41fb-847b-8a483a017894\" (UID: \"545811bf-853d-41fb-847b-8a483a017894\") " Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.449945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl" (OuterVolumeSpecName: "kube-api-access-bjldl") pod "545811bf-853d-41fb-847b-8a483a017894" (UID: "545811bf-853d-41fb-847b-8a483a017894"). InnerVolumeSpecName "kube-api-access-bjldl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.474233 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "545811bf-853d-41fb-847b-8a483a017894" (UID: "545811bf-853d-41fb-847b-8a483a017894"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.476805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory" (OuterVolumeSpecName: "inventory") pod "545811bf-853d-41fb-847b-8a483a017894" (UID: "545811bf-853d-41fb-847b-8a483a017894"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.547293 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.547335 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjldl\" (UniqueName: \"kubernetes.io/projected/545811bf-853d-41fb-847b-8a483a017894-kube-api-access-bjldl\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.547347 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545811bf-853d-41fb-847b-8a483a017894-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.893670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" 
event={"ID":"545811bf-853d-41fb-847b-8a483a017894","Type":"ContainerDied","Data":"3730d6b809460f4a2a374e877546be4da32fd6f2bf7ab42c7e0ac93082551641"} Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.893789 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3730d6b809460f4a2a374e877546be4da32fd6f2bf7ab42c7e0ac93082551641" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.893847 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zmlnd" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.969452 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg"] Jan 21 16:02:13 crc kubenswrapper[4773]: E0121 16:02:13.971513 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545811bf-853d-41fb-847b-8a483a017894" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.971547 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="545811bf-853d-41fb-847b-8a483a017894" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.971822 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="545811bf-853d-41fb-847b-8a483a017894" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.972613 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.975209 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.975259 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.975287 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.975835 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:02:13 crc kubenswrapper[4773]: I0121 16:02:13.985324 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg"] Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.057484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcv2l\" (UniqueName: \"kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.057636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.057926 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.159857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.159951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.160087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcv2l\" (UniqueName: \"kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.166580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.169533 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.182245 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcv2l\" (UniqueName: \"kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.299967 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:14 crc kubenswrapper[4773]: I0121 16:02:14.919284 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg"] Jan 21 16:02:15 crc kubenswrapper[4773]: I0121 16:02:15.915487 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" event={"ID":"5d404546-874a-474a-ac90-b6be34ed0420","Type":"ContainerStarted","Data":"a4c74ae6eb56204e81a55ad899b7311baab9c5b65a348ca0d991f52501352b46"} Jan 21 16:02:16 crc kubenswrapper[4773]: I0121 16:02:16.936585 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" event={"ID":"5d404546-874a-474a-ac90-b6be34ed0420","Type":"ContainerStarted","Data":"c7d14dac043ac42028dc7ad743e1810250af0a70c7f5ef94870eff77a74de3fa"} Jan 21 16:02:16 crc kubenswrapper[4773]: I0121 16:02:16.961647 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" podStartSLOduration=3.130535171 podStartE2EDuration="3.961623634s" podCreationTimestamp="2026-01-21 16:02:13 +0000 UTC" firstStartedPulling="2026-01-21 16:02:14.939927128 +0000 UTC m=+2299.864416750" lastFinishedPulling="2026-01-21 16:02:15.771015591 +0000 UTC m=+2300.695505213" observedRunningTime="2026-01-21 16:02:16.95635428 +0000 UTC m=+2301.880843902" watchObservedRunningTime="2026-01-21 16:02:16.961623634 +0000 UTC m=+2301.886113256" Jan 21 16:02:21 crc kubenswrapper[4773]: I0121 16:02:21.937794 4773 scope.go:117] "RemoveContainer" containerID="6b87c48c14a097a1f64db9734b0f9800d89972e9e28e2c15b6b141c87c13262a" Jan 21 16:02:21 crc kubenswrapper[4773]: I0121 16:02:21.968237 4773 scope.go:117] "RemoveContainer" containerID="4c1b9e99d4d1a77f6f4c818545f52e8f4f4a18fde8a2b8c17f5988d7d732244d" Jan 21 16:02:25 crc 
kubenswrapper[4773]: I0121 16:02:25.205634 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:02:25 crc kubenswrapper[4773]: I0121 16:02:25.206245 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:02:25 crc kubenswrapper[4773]: I0121 16:02:25.206302 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:02:25 crc kubenswrapper[4773]: I0121 16:02:25.207213 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:02:25 crc kubenswrapper[4773]: I0121 16:02:25.207281 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" gracePeriod=600 Jan 21 16:02:27 crc kubenswrapper[4773]: E0121 16:02:27.219628 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:02:28 crc kubenswrapper[4773]: I0121 16:02:28.043340 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" exitCode=0 Jan 21 16:02:28 crc kubenswrapper[4773]: I0121 16:02:28.043428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8"} Jan 21 16:02:28 crc kubenswrapper[4773]: I0121 16:02:28.043698 4773 scope.go:117] "RemoveContainer" containerID="17e7142fafa8e0349bab7f85c21294d342dcff1249216c34be3507a9473a3db2" Jan 21 16:02:28 crc kubenswrapper[4773]: I0121 16:02:28.044447 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:02:28 crc kubenswrapper[4773]: E0121 16:02:28.044788 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:02:28 crc kubenswrapper[4773]: I0121 16:02:28.049063 4773 generic.go:334] "Generic (PLEG): container finished" podID="5d404546-874a-474a-ac90-b6be34ed0420" containerID="c7d14dac043ac42028dc7ad743e1810250af0a70c7f5ef94870eff77a74de3fa" exitCode=0 Jan 21 16:02:28 crc kubenswrapper[4773]: 
I0121 16:02:28.049108 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" event={"ID":"5d404546-874a-474a-ac90-b6be34ed0420","Type":"ContainerDied","Data":"c7d14dac043ac42028dc7ad743e1810250af0a70c7f5ef94870eff77a74de3fa"} Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.593901 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.699301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory\") pod \"5d404546-874a-474a-ac90-b6be34ed0420\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.699365 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam\") pod \"5d404546-874a-474a-ac90-b6be34ed0420\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.699426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcv2l\" (UniqueName: \"kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l\") pod \"5d404546-874a-474a-ac90-b6be34ed0420\" (UID: \"5d404546-874a-474a-ac90-b6be34ed0420\") " Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.705119 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l" (OuterVolumeSpecName: "kube-api-access-zcv2l") pod "5d404546-874a-474a-ac90-b6be34ed0420" (UID: "5d404546-874a-474a-ac90-b6be34ed0420"). InnerVolumeSpecName "kube-api-access-zcv2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.736035 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory" (OuterVolumeSpecName: "inventory") pod "5d404546-874a-474a-ac90-b6be34ed0420" (UID: "5d404546-874a-474a-ac90-b6be34ed0420"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.739366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d404546-874a-474a-ac90-b6be34ed0420" (UID: "5d404546-874a-474a-ac90-b6be34ed0420"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.807453 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.807485 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d404546-874a-474a-ac90-b6be34ed0420-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:29 crc kubenswrapper[4773]: I0121 16:02:29.807497 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcv2l\" (UniqueName: \"kubernetes.io/projected/5d404546-874a-474a-ac90-b6be34ed0420-kube-api-access-zcv2l\") on node \"crc\" DevicePath \"\"" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.072832 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" 
event={"ID":"5d404546-874a-474a-ac90-b6be34ed0420","Type":"ContainerDied","Data":"a4c74ae6eb56204e81a55ad899b7311baab9c5b65a348ca0d991f52501352b46"} Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.072884 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c74ae6eb56204e81a55ad899b7311baab9c5b65a348ca0d991f52501352b46" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.072902 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.188516 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864"] Jan 21 16:02:30 crc kubenswrapper[4773]: E0121 16:02:30.188945 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d404546-874a-474a-ac90-b6be34ed0420" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.188962 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d404546-874a-474a-ac90-b6be34ed0420" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.189193 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d404546-874a-474a-ac90-b6be34ed0420" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.189955 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.192194 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.192874 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193007 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193064 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193080 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193124 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193187 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.193470 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.207343 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864"] Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.216623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.216681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.216738 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.216814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.216866 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217083 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217186 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217219 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprcn\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217424 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.217568 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.335916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.335978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336116 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprcn\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336146 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336172 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: 
\"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336217 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.336417 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.340709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.360105 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.360442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.366313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.367462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc 
kubenswrapper[4773]: I0121 16:02:30.369342 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.372589 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.373124 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.373850 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.374449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprcn\" (UniqueName: 
\"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.377374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.377549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.377783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.377975 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gn864\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:30 crc kubenswrapper[4773]: I0121 16:02:30.508855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:02:31 crc kubenswrapper[4773]: I0121 16:02:31.053850 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864"] Jan 21 16:02:31 crc kubenswrapper[4773]: I0121 16:02:31.089287 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" event={"ID":"1d844687-a8ab-4fab-8b3d-fb3210db5d86","Type":"ContainerStarted","Data":"e80320eaf6c94fcf02eb2ff00544477c55706fe2a7b4fd5629ad6f8b1b16b193"} Jan 21 16:02:32 crc kubenswrapper[4773]: I0121 16:02:32.100425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" event={"ID":"1d844687-a8ab-4fab-8b3d-fb3210db5d86","Type":"ContainerStarted","Data":"9fcc968bef055799a5a9bd8908bbd448f9ac86f6721b384df0c802e5659e7383"} Jan 21 16:02:32 crc kubenswrapper[4773]: I0121 16:02:32.127029 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" podStartSLOduration=1.404816144 podStartE2EDuration="2.127003372s" podCreationTimestamp="2026-01-21 16:02:30 +0000 UTC" firstStartedPulling="2026-01-21 16:02:31.055752349 +0000 UTC m=+2315.980241971" lastFinishedPulling="2026-01-21 16:02:31.777939577 +0000 UTC m=+2316.702429199" observedRunningTime="2026-01-21 16:02:32.117642577 +0000 UTC m=+2317.042132219" watchObservedRunningTime="2026-01-21 16:02:32.127003372 +0000 UTC m=+2317.051492994" Jan 21 16:02:39 crc kubenswrapper[4773]: I0121 16:02:39.384070 4773 
scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:02:39 crc kubenswrapper[4773]: E0121 16:02:39.385357 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:02:52 crc kubenswrapper[4773]: I0121 16:02:52.383828 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:02:52 crc kubenswrapper[4773]: E0121 16:02:52.384726 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:03:05 crc kubenswrapper[4773]: I0121 16:03:05.395187 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:03:05 crc kubenswrapper[4773]: E0121 16:03:05.411121 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:03:11 crc kubenswrapper[4773]: I0121 
16:03:11.785309 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d844687-a8ab-4fab-8b3d-fb3210db5d86" containerID="9fcc968bef055799a5a9bd8908bbd448f9ac86f6721b384df0c802e5659e7383" exitCode=0 Jan 21 16:03:11 crc kubenswrapper[4773]: I0121 16:03:11.785408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" event={"ID":"1d844687-a8ab-4fab-8b3d-fb3210db5d86","Type":"ContainerDied","Data":"9fcc968bef055799a5a9bd8908bbd448f9ac86f6721b384df0c802e5659e7383"} Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.317543 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.483827 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.483938 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.484989 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc 
kubenswrapper[4773]: I0121 16:03:13.485044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485144 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485238 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485287 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qprcn\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485374 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485400 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.485582 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle\") pod \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\" (UID: \"1d844687-a8ab-4fab-8b3d-fb3210db5d86\") " Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.490852 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.490924 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.491995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.492063 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.492540 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.492758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.493156 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.494182 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn" (OuterVolumeSpecName: "kube-api-access-qprcn") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "kube-api-access-qprcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.496734 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.496888 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.500376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.504759 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.529576 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory" (OuterVolumeSpecName: "inventory") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.529843 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d844687-a8ab-4fab-8b3d-fb3210db5d86" (UID: "1d844687-a8ab-4fab-8b3d-fb3210db5d86"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589041 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589085 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589101 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589114 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589128 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589141 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589154 4773 reconciler_common.go:293] "Volume detached 
for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589167 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589178 4773 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589189 4773 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589199 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qprcn\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-kube-api-access-qprcn\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589211 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589223 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d844687-a8ab-4fab-8b3d-fb3210db5d86-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.589235 4773 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1d844687-a8ab-4fab-8b3d-fb3210db5d86-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.805219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" event={"ID":"1d844687-a8ab-4fab-8b3d-fb3210db5d86","Type":"ContainerDied","Data":"e80320eaf6c94fcf02eb2ff00544477c55706fe2a7b4fd5629ad6f8b1b16b193"} Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.805285 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80320eaf6c94fcf02eb2ff00544477c55706fe2a7b4fd5629ad6f8b1b16b193" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.805296 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gn864" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.901821 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p"] Jan 21 16:03:13 crc kubenswrapper[4773]: E0121 16:03:13.902244 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d844687-a8ab-4fab-8b3d-fb3210db5d86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.902265 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d844687-a8ab-4fab-8b3d-fb3210db5d86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.902481 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d844687-a8ab-4fab-8b3d-fb3210db5d86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.903474 4773 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.906591 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.907726 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.908630 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.908882 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.910002 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.914398 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p"] Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.998172 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.998425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: 
\"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.998595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnrl\" (UniqueName: \"kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.998682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:13 crc kubenswrapper[4773]: I0121 16:03:13.998779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.100890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnrl\" (UniqueName: \"kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.100978 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.101031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.101149 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.101233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.102452 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: 
\"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.105745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.106491 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.109423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.119321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnrl\" (UniqueName: \"kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-htv6p\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.219193 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.764217 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p"] Jan 21 16:03:14 crc kubenswrapper[4773]: I0121 16:03:14.815734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" event={"ID":"1ab3d038-3af9-4719-872d-fc431de9959b","Type":"ContainerStarted","Data":"ce856b63a7f8cb4824ca0090bebc21a212490c06c4c1b5908ca7b37cd4428c60"} Jan 21 16:03:16 crc kubenswrapper[4773]: I0121 16:03:16.844619 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" event={"ID":"1ab3d038-3af9-4719-872d-fc431de9959b","Type":"ContainerStarted","Data":"d4f456856429c275a3115a20b0c78d7b73e1e47ec972ad963b6c831ff8802de3"} Jan 21 16:03:16 crc kubenswrapper[4773]: I0121 16:03:16.860491 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" podStartSLOduration=3.286730995 podStartE2EDuration="3.860472407s" podCreationTimestamp="2026-01-21 16:03:13 +0000 UTC" firstStartedPulling="2026-01-21 16:03:14.770154657 +0000 UTC m=+2359.694644279" lastFinishedPulling="2026-01-21 16:03:15.343896049 +0000 UTC m=+2360.268385691" observedRunningTime="2026-01-21 16:03:16.860120477 +0000 UTC m=+2361.784610129" watchObservedRunningTime="2026-01-21 16:03:16.860472407 +0000 UTC m=+2361.784962029" Jan 21 16:03:20 crc kubenswrapper[4773]: I0121 16:03:20.384365 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:03:20 crc kubenswrapper[4773]: E0121 16:03:20.385193 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:03:35 crc kubenswrapper[4773]: I0121 16:03:35.389893 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:03:35 crc kubenswrapper[4773]: E0121 16:03:35.390583 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:03:46 crc kubenswrapper[4773]: I0121 16:03:46.383841 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:03:46 crc kubenswrapper[4773]: E0121 16:03:46.384643 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:04:00 crc kubenswrapper[4773]: I0121 16:04:00.384485 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:04:00 crc kubenswrapper[4773]: E0121 16:04:00.386587 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:04:14 crc kubenswrapper[4773]: I0121 16:04:14.384152 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:04:14 crc kubenswrapper[4773]: E0121 16:04:14.385013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:04:19 crc kubenswrapper[4773]: I0121 16:04:19.453031 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ab3d038-3af9-4719-872d-fc431de9959b" containerID="d4f456856429c275a3115a20b0c78d7b73e1e47ec972ad963b6c831ff8802de3" exitCode=0 Jan 21 16:04:19 crc kubenswrapper[4773]: I0121 16:04:19.453226 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" event={"ID":"1ab3d038-3af9-4719-872d-fc431de9959b","Type":"ContainerDied","Data":"d4f456856429c275a3115a20b0c78d7b73e1e47ec972ad963b6c831ff8802de3"} Jan 21 16:04:20 crc kubenswrapper[4773]: I0121 16:04:20.948389 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.092589 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam\") pod \"1ab3d038-3af9-4719-872d-fc431de9959b\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.092653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnrl\" (UniqueName: \"kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl\") pod \"1ab3d038-3af9-4719-872d-fc431de9959b\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.093684 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory\") pod \"1ab3d038-3af9-4719-872d-fc431de9959b\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.093831 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle\") pod \"1ab3d038-3af9-4719-872d-fc431de9959b\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.093855 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0\") pod \"1ab3d038-3af9-4719-872d-fc431de9959b\" (UID: \"1ab3d038-3af9-4719-872d-fc431de9959b\") " Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.099882 4773 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1ab3d038-3af9-4719-872d-fc431de9959b" (UID: "1ab3d038-3af9-4719-872d-fc431de9959b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.100218 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl" (OuterVolumeSpecName: "kube-api-access-8bnrl") pod "1ab3d038-3af9-4719-872d-fc431de9959b" (UID: "1ab3d038-3af9-4719-872d-fc431de9959b"). InnerVolumeSpecName "kube-api-access-8bnrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.122125 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1ab3d038-3af9-4719-872d-fc431de9959b" (UID: "1ab3d038-3af9-4719-872d-fc431de9959b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.126270 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory" (OuterVolumeSpecName: "inventory") pod "1ab3d038-3af9-4719-872d-fc431de9959b" (UID: "1ab3d038-3af9-4719-872d-fc431de9959b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.129422 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1ab3d038-3af9-4719-872d-fc431de9959b" (UID: "1ab3d038-3af9-4719-872d-fc431de9959b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.195810 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.195844 4773 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1ab3d038-3af9-4719-872d-fc431de9959b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.195856 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.195868 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnrl\" (UniqueName: \"kubernetes.io/projected/1ab3d038-3af9-4719-872d-fc431de9959b-kube-api-access-8bnrl\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.195880 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ab3d038-3af9-4719-872d-fc431de9959b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.475334 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" event={"ID":"1ab3d038-3af9-4719-872d-fc431de9959b","Type":"ContainerDied","Data":"ce856b63a7f8cb4824ca0090bebc21a212490c06c4c1b5908ca7b37cd4428c60"} Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.475381 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce856b63a7f8cb4824ca0090bebc21a212490c06c4c1b5908ca7b37cd4428c60" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.475583 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-htv6p" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.669318 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n"] Jan 21 16:04:21 crc kubenswrapper[4773]: E0121 16:04:21.674578 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab3d038-3af9-4719-872d-fc431de9959b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.674628 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab3d038-3af9-4719-872d-fc431de9959b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.675673 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab3d038-3af9-4719-872d-fc431de9959b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.683430 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.686392 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.689060 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.689130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.689334 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.689435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.689620 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.695504 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n"] Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708328 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708410 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxg79\" (UniqueName: \"kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.708616 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810333 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810633 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxg79\" (UniqueName: \"kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.810665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.815126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.815128 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.815167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.815425 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.817242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:21 crc kubenswrapper[4773]: I0121 16:04:21.828414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxg79\" (UniqueName: \"kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:22 crc kubenswrapper[4773]: I0121 16:04:22.012248 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:04:22 crc kubenswrapper[4773]: I0121 16:04:22.562021 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n"] Jan 21 16:04:23 crc kubenswrapper[4773]: I0121 16:04:23.496428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" event={"ID":"60b51255-6cb0-404b-9431-a04ded467081","Type":"ContainerStarted","Data":"acb57ce8de0513429ce46629e7c4f3e980b2c797338bc530cf04266125f00c28"} Jan 21 16:04:25 crc kubenswrapper[4773]: I0121 16:04:25.516605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" event={"ID":"60b51255-6cb0-404b-9431-a04ded467081","Type":"ContainerStarted","Data":"de30c6dc6786c80e37d2e5a998bde9bf986e9c6ea412c1cc5c8cbae83bdccf24"} Jan 21 16:04:25 crc kubenswrapper[4773]: I0121 16:04:25.560518 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" podStartSLOduration=2.590759962 podStartE2EDuration="4.560498439s" podCreationTimestamp="2026-01-21 16:04:21 +0000 UTC" firstStartedPulling="2026-01-21 16:04:22.570358228 +0000 UTC m=+2427.494847850" lastFinishedPulling="2026-01-21 16:04:24.540096705 +0000 UTC m=+2429.464586327" observedRunningTime="2026-01-21 16:04:25.558098833 +0000 UTC m=+2430.482588455" watchObservedRunningTime="2026-01-21 16:04:25.560498439 +0000 UTC m=+2430.484988061" Jan 21 16:04:27 crc kubenswrapper[4773]: I0121 16:04:27.383646 4773 scope.go:117] "RemoveContainer" 
containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:04:27 crc kubenswrapper[4773]: E0121 16:04:27.384304 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:04:38 crc kubenswrapper[4773]: I0121 16:04:38.384487 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:04:38 crc kubenswrapper[4773]: E0121 16:04:38.385536 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:04:50 crc kubenswrapper[4773]: I0121 16:04:50.384321 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:04:50 crc kubenswrapper[4773]: E0121 16:04:50.385298 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:05:03 crc kubenswrapper[4773]: I0121 16:05:03.384934 4773 scope.go:117] 
"RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:05:03 crc kubenswrapper[4773]: E0121 16:05:03.387324 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:05:15 crc kubenswrapper[4773]: I0121 16:05:15.390758 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:05:15 crc kubenswrapper[4773]: E0121 16:05:15.391572 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:05:16 crc kubenswrapper[4773]: I0121 16:05:16.044670 4773 generic.go:334] "Generic (PLEG): container finished" podID="60b51255-6cb0-404b-9431-a04ded467081" containerID="de30c6dc6786c80e37d2e5a998bde9bf986e9c6ea412c1cc5c8cbae83bdccf24" exitCode=0 Jan 21 16:05:16 crc kubenswrapper[4773]: I0121 16:05:16.044773 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" event={"ID":"60b51255-6cb0-404b-9431-a04ded467081","Type":"ContainerDied","Data":"de30c6dc6786c80e37d2e5a998bde9bf986e9c6ea412c1cc5c8cbae83bdccf24"} Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.593669 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.750775 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.751256 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.751353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxg79\" (UniqueName: \"kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.751397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.751459 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 
16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.751511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0\") pod \"60b51255-6cb0-404b-9431-a04ded467081\" (UID: \"60b51255-6cb0-404b-9431-a04ded467081\") " Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.770901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: "60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.775256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79" (OuterVolumeSpecName: "kube-api-access-xxg79") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: "60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "kube-api-access-xxg79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.807961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: "60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.815290 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory" (OuterVolumeSpecName: "inventory") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: "60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.853594 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.853628 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.853641 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.853649 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxg79\" (UniqueName: \"kubernetes.io/projected/60b51255-6cb0-404b-9431-a04ded467081-kube-api-access-xxg79\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.867754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: 
"60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.881980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "60b51255-6cb0-404b-9431-a04ded467081" (UID: "60b51255-6cb0-404b-9431-a04ded467081"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.955842 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:17 crc kubenswrapper[4773]: I0121 16:05:17.955898 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60b51255-6cb0-404b-9431-a04ded467081-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.067406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" event={"ID":"60b51255-6cb0-404b-9431-a04ded467081","Type":"ContainerDied","Data":"acb57ce8de0513429ce46629e7c4f3e980b2c797338bc530cf04266125f00c28"} Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.067468 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb57ce8de0513429ce46629e7c4f3e980b2c797338bc530cf04266125f00c28" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.067546 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.165773 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl"] Jan 21 16:05:18 crc kubenswrapper[4773]: E0121 16:05:18.166328 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b51255-6cb0-404b-9431-a04ded467081" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.166352 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b51255-6cb0-404b-9431-a04ded467081" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.166587 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b51255-6cb0-404b-9431-a04ded467081" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.167566 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.170329 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.170645 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.170986 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.171179 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.171939 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.177623 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl"] Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.261952 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlddd\" (UniqueName: \"kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.262050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.262430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.262575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.262652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.365393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlddd\" (UniqueName: \"kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.365842 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.366033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.366186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.366313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.370364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: 
\"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.370431 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.370467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.370682 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.386878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlddd\" (UniqueName: \"kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:18 crc kubenswrapper[4773]: I0121 16:05:18.485736 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:05:19 crc kubenswrapper[4773]: I0121 16:05:19.111212 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl"] Jan 21 16:05:20 crc kubenswrapper[4773]: I0121 16:05:20.090141 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" event={"ID":"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd","Type":"ContainerStarted","Data":"1999d4c1404ba3f4e9cf68e9ee3edc03e4c7cade743bc560951de982f376c3f8"} Jan 21 16:05:21 crc kubenswrapper[4773]: I0121 16:05:21.111229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" event={"ID":"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd","Type":"ContainerStarted","Data":"2790ebfb596ad73f9b7d09ee55affeb5e511eef97224dea2b5091d3144faa754"} Jan 21 16:05:21 crc kubenswrapper[4773]: I0121 16:05:21.138138 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" podStartSLOduration=1.963780872 podStartE2EDuration="3.138119911s" podCreationTimestamp="2026-01-21 16:05:18 +0000 UTC" firstStartedPulling="2026-01-21 16:05:19.115454358 +0000 UTC m=+2484.039943980" lastFinishedPulling="2026-01-21 16:05:20.289793387 +0000 UTC m=+2485.214283019" observedRunningTime="2026-01-21 16:05:21.129623909 +0000 UTC m=+2486.054113531" watchObservedRunningTime="2026-01-21 16:05:21.138119911 +0000 UTC m=+2486.062609533" Jan 21 16:05:27 crc kubenswrapper[4773]: I0121 16:05:27.384289 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:05:27 crc kubenswrapper[4773]: E0121 16:05:27.385175 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:05:38 crc kubenswrapper[4773]: I0121 16:05:38.384140 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:05:38 crc kubenswrapper[4773]: E0121 16:05:38.385086 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:05:49 crc kubenswrapper[4773]: I0121 16:05:49.384294 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:05:49 crc kubenswrapper[4773]: E0121 16:05:49.385312 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:06:02 crc kubenswrapper[4773]: I0121 16:06:02.384508 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:06:02 crc kubenswrapper[4773]: E0121 16:06:02.386197 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:06:13 crc kubenswrapper[4773]: I0121 16:06:13.384018 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:06:13 crc kubenswrapper[4773]: E0121 16:06:13.384962 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:06:24 crc kubenswrapper[4773]: I0121 16:06:24.383954 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:06:24 crc kubenswrapper[4773]: E0121 16:06:24.384879 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:06:36 crc kubenswrapper[4773]: I0121 16:06:36.384525 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:06:36 crc kubenswrapper[4773]: E0121 16:06:36.385802 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:06:48 crc kubenswrapper[4773]: I0121 16:06:48.384498 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:06:48 crc kubenswrapper[4773]: E0121 16:06:48.385492 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:07:03 crc kubenswrapper[4773]: I0121 16:07:03.383813 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:07:03 crc kubenswrapper[4773]: E0121 16:07:03.384538 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:07:15 crc kubenswrapper[4773]: I0121 16:07:15.393409 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:07:15 crc kubenswrapper[4773]: E0121 16:07:15.394406 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:07:22 crc kubenswrapper[4773]: I0121 16:07:22.153374 4773 scope.go:117] "RemoveContainer" containerID="831429d8ba5f58d2ac48cfb5ebf75d78820042b23709a9ce3c15cb66a84ef83b" Jan 21 16:07:22 crc kubenswrapper[4773]: I0121 16:07:22.187392 4773 scope.go:117] "RemoveContainer" containerID="a96830c86a1cc8e2693c99b4233ca1254d595026642b844bf11ce89e0c0cf4de" Jan 21 16:07:22 crc kubenswrapper[4773]: I0121 16:07:22.210635 4773 scope.go:117] "RemoveContainer" containerID="fcb56e1e577b9a29d4104b4cc811116ffb3f711c5be759c36423352dc752695f" Jan 21 16:07:27 crc kubenswrapper[4773]: I0121 16:07:27.386722 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:07:28 crc kubenswrapper[4773]: I0121 16:07:28.393476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8"} Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.289667 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.295650 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.303985 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.386910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.387013 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.387295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z5pm\" (UniqueName: \"kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.489097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.489196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2z5pm\" (UniqueName: \"kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.489450 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.489967 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.490510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.511361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z5pm\" (UniqueName: \"kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm\") pod \"redhat-operators-8d4pc\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:57 crc kubenswrapper[4773]: I0121 16:07:57.621910 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:07:58 crc kubenswrapper[4773]: I0121 16:07:58.147080 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:07:58 crc kubenswrapper[4773]: I0121 16:07:58.699086 4773 generic.go:334] "Generic (PLEG): container finished" podID="e525612b-8114-4df5-ad3e-dfc99519175a" containerID="b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978" exitCode=0 Jan 21 16:07:58 crc kubenswrapper[4773]: I0121 16:07:58.699176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerDied","Data":"b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978"} Jan 21 16:07:58 crc kubenswrapper[4773]: I0121 16:07:58.699460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerStarted","Data":"221012b30d108857d135c1364f905f3618e1aebb564e2327abea47602f2e0d58"} Jan 21 16:07:58 crc kubenswrapper[4773]: I0121 16:07:58.701857 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:08:00 crc kubenswrapper[4773]: I0121 16:08:00.720735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerStarted","Data":"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f"} Jan 21 16:08:06 crc kubenswrapper[4773]: I0121 16:08:06.782590 4773 generic.go:334] "Generic (PLEG): container finished" podID="e525612b-8114-4df5-ad3e-dfc99519175a" containerID="a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f" exitCode=0 Jan 21 16:08:06 crc kubenswrapper[4773]: I0121 16:08:06.782744 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerDied","Data":"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f"} Jan 21 16:08:09 crc kubenswrapper[4773]: I0121 16:08:09.816121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerStarted","Data":"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e"} Jan 21 16:08:09 crc kubenswrapper[4773]: I0121 16:08:09.845162 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8d4pc" podStartSLOduration=2.7511928230000002 podStartE2EDuration="12.845140405s" podCreationTimestamp="2026-01-21 16:07:57 +0000 UTC" firstStartedPulling="2026-01-21 16:07:58.70156466 +0000 UTC m=+2643.626054272" lastFinishedPulling="2026-01-21 16:08:08.795512222 +0000 UTC m=+2653.720001854" observedRunningTime="2026-01-21 16:08:09.843225602 +0000 UTC m=+2654.767715234" watchObservedRunningTime="2026-01-21 16:08:09.845140405 +0000 UTC m=+2654.769630017" Jan 21 16:08:17 crc kubenswrapper[4773]: I0121 16:08:17.622869 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:17 crc kubenswrapper[4773]: I0121 16:08:17.623397 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:17 crc kubenswrapper[4773]: I0121 16:08:17.674129 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:17 crc kubenswrapper[4773]: I0121 16:08:17.962052 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:18 crc kubenswrapper[4773]: I0121 16:08:18.011335 4773 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:08:19 crc kubenswrapper[4773]: I0121 16:08:19.924955 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8d4pc" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="registry-server" containerID="cri-o://959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e" gracePeriod=2 Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.535566 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.711005 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content\") pod \"e525612b-8114-4df5-ad3e-dfc99519175a\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.711754 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities\") pod \"e525612b-8114-4df5-ad3e-dfc99519175a\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.711926 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z5pm\" (UniqueName: \"kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm\") pod \"e525612b-8114-4df5-ad3e-dfc99519175a\" (UID: \"e525612b-8114-4df5-ad3e-dfc99519175a\") " Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.713787 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities" (OuterVolumeSpecName: "utilities") pod 
"e525612b-8114-4df5-ad3e-dfc99519175a" (UID: "e525612b-8114-4df5-ad3e-dfc99519175a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.720797 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm" (OuterVolumeSpecName: "kube-api-access-2z5pm") pod "e525612b-8114-4df5-ad3e-dfc99519175a" (UID: "e525612b-8114-4df5-ad3e-dfc99519175a"). InnerVolumeSpecName "kube-api-access-2z5pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.814593 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z5pm\" (UniqueName: \"kubernetes.io/projected/e525612b-8114-4df5-ad3e-dfc99519175a-kube-api-access-2z5pm\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.814629 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.834845 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e525612b-8114-4df5-ad3e-dfc99519175a" (UID: "e525612b-8114-4df5-ad3e-dfc99519175a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.916312 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525612b-8114-4df5-ad3e-dfc99519175a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.938103 4773 generic.go:334] "Generic (PLEG): container finished" podID="e525612b-8114-4df5-ad3e-dfc99519175a" containerID="959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e" exitCode=0 Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.938176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerDied","Data":"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e"} Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.938209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d4pc" event={"ID":"e525612b-8114-4df5-ad3e-dfc99519175a","Type":"ContainerDied","Data":"221012b30d108857d135c1364f905f3618e1aebb564e2327abea47602f2e0d58"} Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.938231 4773 scope.go:117] "RemoveContainer" containerID="959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.938405 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8d4pc" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.962771 4773 scope.go:117] "RemoveContainer" containerID="a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f" Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.980326 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:08:20 crc kubenswrapper[4773]: I0121 16:08:20.990533 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8d4pc"] Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.002938 4773 scope.go:117] "RemoveContainer" containerID="b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.038562 4773 scope.go:117] "RemoveContainer" containerID="959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e" Jan 21 16:08:21 crc kubenswrapper[4773]: E0121 16:08:21.039737 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e\": container with ID starting with 959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e not found: ID does not exist" containerID="959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.039780 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e"} err="failed to get container status \"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e\": rpc error: code = NotFound desc = could not find container \"959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e\": container with ID starting with 959a44216f8cc225f639937989643dc4c37b1a24ad1990eed55450f28c6a034e not found: ID does 
not exist" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.039806 4773 scope.go:117] "RemoveContainer" containerID="a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f" Jan 21 16:08:21 crc kubenswrapper[4773]: E0121 16:08:21.040607 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f\": container with ID starting with a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f not found: ID does not exist" containerID="a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.040637 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f"} err="failed to get container status \"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f\": rpc error: code = NotFound desc = could not find container \"a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f\": container with ID starting with a626adfa86d88118c966139967181615ce3bdf9e8b80b97d9add0189c6b1a00f not found: ID does not exist" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.040656 4773 scope.go:117] "RemoveContainer" containerID="b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978" Jan 21 16:08:21 crc kubenswrapper[4773]: E0121 16:08:21.041156 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978\": container with ID starting with b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978 not found: ID does not exist" containerID="b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.041239 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978"} err="failed to get container status \"b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978\": rpc error: code = NotFound desc = could not find container \"b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978\": container with ID starting with b632939a0725bc5167b0cc5016cedf4c1fb645d6781df67acecfc9cacf46e978 not found: ID does not exist" Jan 21 16:08:21 crc kubenswrapper[4773]: I0121 16:08:21.396675 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" path="/var/lib/kubelet/pods/e525612b-8114-4df5-ad3e-dfc99519175a/volumes" Jan 21 16:09:55 crc kubenswrapper[4773]: I0121 16:09:55.205959 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:09:55 crc kubenswrapper[4773]: I0121 16:09:55.206574 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:10:01 crc kubenswrapper[4773]: I0121 16:10:01.860750 4773 generic.go:334] "Generic (PLEG): container finished" podID="c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" containerID="2790ebfb596ad73f9b7d09ee55affeb5e511eef97224dea2b5091d3144faa754" exitCode=0 Jan 21 16:10:01 crc kubenswrapper[4773]: I0121 16:10:01.860834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" 
event={"ID":"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd","Type":"ContainerDied","Data":"2790ebfb596ad73f9b7d09ee55affeb5e511eef97224dea2b5091d3144faa754"} Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.549917 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.629472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam\") pod \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.629655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle\") pod \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.629794 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory\") pod \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.629836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0\") pod \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.629968 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlddd\" (UniqueName: 
\"kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd\") pod \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\" (UID: \"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd\") " Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.636391 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" (UID: "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.636817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd" (OuterVolumeSpecName: "kube-api-access-jlddd") pod "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" (UID: "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd"). InnerVolumeSpecName "kube-api-access-jlddd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.664412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" (UID: "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.665139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" (UID: "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.665565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory" (OuterVolumeSpecName: "inventory") pod "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" (UID: "c6a589bb-2c6a-48e3-80bd-daa3599ba7fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.733591 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlddd\" (UniqueName: \"kubernetes.io/projected/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-kube-api-access-jlddd\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.733624 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.733633 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.733643 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.733652 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c6a589bb-2c6a-48e3-80bd-daa3599ba7fd-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.880559 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" event={"ID":"c6a589bb-2c6a-48e3-80bd-daa3599ba7fd","Type":"ContainerDied","Data":"1999d4c1404ba3f4e9cf68e9ee3edc03e4c7cade743bc560951de982f376c3f8"} Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.880597 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1999d4c1404ba3f4e9cf68e9ee3edc03e4c7cade743bc560951de982f376c3f8" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.880608 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.978116 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt"] Jan 21 16:10:03 crc kubenswrapper[4773]: E0121 16:10:03.978684 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="registry-server" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.978727 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="registry-server" Jan 21 16:10:03 crc kubenswrapper[4773]: E0121 16:10:03.978755 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.978765 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 16:10:03 crc kubenswrapper[4773]: E0121 16:10:03.978781 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="extract-content" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.978789 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="extract-content" Jan 21 16:10:03 crc kubenswrapper[4773]: E0121 16:10:03.978806 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="extract-utilities" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.978812 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="extract-utilities" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.979039 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e525612b-8114-4df5-ad3e-dfc99519175a" containerName="registry-server" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.979060 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a589bb-2c6a-48e3-80bd-daa3599ba7fd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.979825 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.984160 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.984362 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.984067 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.984487 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.985149 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.985246 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.986216 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:10:03 crc kubenswrapper[4773]: I0121 16:10:03.991826 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt"] Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 
16:10:04.038378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038450 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcdq\" (UniqueName: 
\"kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038674 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.038822 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.140859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: 
\"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.140898 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.140925 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.140973 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcdq\" (UniqueName: \"kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141201 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.141822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.144849 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.144871 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.144979 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.145635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.146579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.146941 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.148422 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.167466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcdq\" (UniqueName: \"kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cvdbt\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.299828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.856555 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt"] Jan 21 16:10:04 crc kubenswrapper[4773]: I0121 16:10:04.890940 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" event={"ID":"88d600dd-1b0b-4e33-a91f-4375318fdc5f","Type":"ContainerStarted","Data":"6462f2435a449d7f3307b577826aa1f43a5057ad4b566ba30899f6e14cf1df55"} Jan 21 16:10:07 crc kubenswrapper[4773]: I0121 16:10:07.917435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" event={"ID":"88d600dd-1b0b-4e33-a91f-4375318fdc5f","Type":"ContainerStarted","Data":"d38fc267ae20a8f0ee0e45eac671a41c97648626a00d77bc69ebd1da59043852"} Jan 21 16:10:07 crc kubenswrapper[4773]: I0121 16:10:07.939684 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" podStartSLOduration=2.969413237 podStartE2EDuration="4.939663438s" podCreationTimestamp="2026-01-21 16:10:03 +0000 UTC" firstStartedPulling="2026-01-21 16:10:04.860057444 +0000 UTC m=+2769.784547066" lastFinishedPulling="2026-01-21 16:10:06.830307635 +0000 UTC m=+2771.754797267" observedRunningTime="2026-01-21 16:10:07.931339201 +0000 UTC m=+2772.855828823" watchObservedRunningTime="2026-01-21 16:10:07.939663438 +0000 UTC m=+2772.864153060" Jan 21 16:10:25 crc kubenswrapper[4773]: I0121 16:10:25.205594 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:10:25 crc kubenswrapper[4773]: I0121 16:10:25.206170 
4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.206193 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.206913 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.206972 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.207939 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.208013 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" 
containerName="machine-config-daemon" containerID="cri-o://13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8" gracePeriod=600 Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.365759 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8" exitCode=0 Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.366114 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8"} Jan 21 16:10:55 crc kubenswrapper[4773]: I0121 16:10:55.366158 4773 scope.go:117] "RemoveContainer" containerID="05c06873f9c17d9d8a409edd3c5451374cc7edf9fad310681c0a1edf8a9342d8" Jan 21 16:10:56 crc kubenswrapper[4773]: I0121 16:10:56.379529 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d"} Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.394198 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.397154 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.409524 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.430422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n44h\" (UniqueName: \"kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.430608 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.430675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.532244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.532589 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.532688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n44h\" (UniqueName: \"kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.533469 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.533821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.553151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n44h\" (UniqueName: \"kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h\") pod \"redhat-marketplace-85bnp\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:00 crc kubenswrapper[4773]: I0121 16:11:00.724738 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:01 crc kubenswrapper[4773]: I0121 16:11:01.293632 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:01 crc kubenswrapper[4773]: I0121 16:11:01.421535 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerStarted","Data":"7fbe4fb331a830f71000e272c55dbf9af1a71b85adf4af2fb9aea37f18d6bd5d"} Jan 21 16:11:03 crc kubenswrapper[4773]: I0121 16:11:03.439460 4773 generic.go:334] "Generic (PLEG): container finished" podID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerID="249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb" exitCode=0 Jan 21 16:11:03 crc kubenswrapper[4773]: I0121 16:11:03.439512 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerDied","Data":"249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb"} Jan 21 16:11:04 crc kubenswrapper[4773]: I0121 16:11:04.451800 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerStarted","Data":"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb"} Jan 21 16:11:05 crc kubenswrapper[4773]: I0121 16:11:05.462915 4773 generic.go:334] "Generic (PLEG): container finished" podID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerID="ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb" exitCode=0 Jan 21 16:11:05 crc kubenswrapper[4773]: I0121 16:11:05.463229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" 
event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerDied","Data":"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb"} Jan 21 16:11:06 crc kubenswrapper[4773]: I0121 16:11:06.476330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerStarted","Data":"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13"} Jan 21 16:11:06 crc kubenswrapper[4773]: I0121 16:11:06.507142 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-85bnp" podStartSLOduration=3.878659046 podStartE2EDuration="6.507120277s" podCreationTimestamp="2026-01-21 16:11:00 +0000 UTC" firstStartedPulling="2026-01-21 16:11:03.4409051 +0000 UTC m=+2828.365394722" lastFinishedPulling="2026-01-21 16:11:06.069366331 +0000 UTC m=+2830.993855953" observedRunningTime="2026-01-21 16:11:06.502588884 +0000 UTC m=+2831.427078516" watchObservedRunningTime="2026-01-21 16:11:06.507120277 +0000 UTC m=+2831.431609899" Jan 21 16:11:10 crc kubenswrapper[4773]: I0121 16:11:10.725100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:10 crc kubenswrapper[4773]: I0121 16:11:10.726198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:10 crc kubenswrapper[4773]: I0121 16:11:10.772384 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:11 crc kubenswrapper[4773]: I0121 16:11:11.579861 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:11 crc kubenswrapper[4773]: I0121 16:11:11.636846 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:13 crc kubenswrapper[4773]: I0121 16:11:13.546576 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-85bnp" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="registry-server" containerID="cri-o://4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13" gracePeriod=2 Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.084835 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.253018 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities\") pod \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.253211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n44h\" (UniqueName: \"kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h\") pod \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.253316 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content\") pod \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\" (UID: \"1c61ea1f-15f2-4150-801c-00da5c5a3a34\") " Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.254242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities" (OuterVolumeSpecName: "utilities") pod "1c61ea1f-15f2-4150-801c-00da5c5a3a34" (UID: 
"1c61ea1f-15f2-4150-801c-00da5c5a3a34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.260616 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h" (OuterVolumeSpecName: "kube-api-access-6n44h") pod "1c61ea1f-15f2-4150-801c-00da5c5a3a34" (UID: "1c61ea1f-15f2-4150-801c-00da5c5a3a34"). InnerVolumeSpecName "kube-api-access-6n44h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.280937 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c61ea1f-15f2-4150-801c-00da5c5a3a34" (UID: "1c61ea1f-15f2-4150-801c-00da5c5a3a34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.356521 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.356563 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n44h\" (UniqueName: \"kubernetes.io/projected/1c61ea1f-15f2-4150-801c-00da5c5a3a34-kube-api-access-6n44h\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.356581 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c61ea1f-15f2-4150-801c-00da5c5a3a34-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.557386 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerID="4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13" exitCode=0 Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.557496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerDied","Data":"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13"} Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.558571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-85bnp" event={"ID":"1c61ea1f-15f2-4150-801c-00da5c5a3a34","Type":"ContainerDied","Data":"7fbe4fb331a830f71000e272c55dbf9af1a71b85adf4af2fb9aea37f18d6bd5d"} Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.558597 4773 scope.go:117] "RemoveContainer" containerID="4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.557516 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-85bnp" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.577934 4773 scope.go:117] "RemoveContainer" containerID="ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.597915 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.609875 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-85bnp"] Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.615052 4773 scope.go:117] "RemoveContainer" containerID="249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.655939 4773 scope.go:117] "RemoveContainer" containerID="4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13" Jan 21 16:11:14 crc kubenswrapper[4773]: E0121 16:11:14.656528 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13\": container with ID starting with 4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13 not found: ID does not exist" containerID="4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.656568 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13"} err="failed to get container status \"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13\": rpc error: code = NotFound desc = could not find container \"4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13\": container with ID starting with 4a0a6907a3fd3de4bb033bb37844202b73c7440d0ee310d35a16e51e7ffc3e13 not found: 
ID does not exist" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.656604 4773 scope.go:117] "RemoveContainer" containerID="ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb" Jan 21 16:11:14 crc kubenswrapper[4773]: E0121 16:11:14.656974 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb\": container with ID starting with ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb not found: ID does not exist" containerID="ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.657014 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb"} err="failed to get container status \"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb\": rpc error: code = NotFound desc = could not find container \"ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb\": container with ID starting with ea6d7b853f6e69518f2c3e7676abf764ab20e58a0dd17997f05046ae62830deb not found: ID does not exist" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.657041 4773 scope.go:117] "RemoveContainer" containerID="249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb" Jan 21 16:11:14 crc kubenswrapper[4773]: E0121 16:11:14.657409 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb\": container with ID starting with 249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb not found: ID does not exist" containerID="249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb" Jan 21 16:11:14 crc kubenswrapper[4773]: I0121 16:11:14.657433 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb"} err="failed to get container status \"249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb\": rpc error: code = NotFound desc = could not find container \"249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb\": container with ID starting with 249824720f89e312cf1e60496a7bfbf18d78839dcda1badb3cf1fe524259cbdb not found: ID does not exist" Jan 21 16:11:15 crc kubenswrapper[4773]: I0121 16:11:15.410998 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" path="/var/lib/kubelet/pods/1c61ea1f-15f2-4150-801c-00da5c5a3a34/volumes" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.702816 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:11:55 crc kubenswrapper[4773]: E0121 16:11:55.707648 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="extract-content" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.707709 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="extract-content" Jan 21 16:11:55 crc kubenswrapper[4773]: E0121 16:11:55.707752 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="extract-utilities" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.707762 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="extract-utilities" Jan 21 16:11:55 crc kubenswrapper[4773]: E0121 16:11:55.707774 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="registry-server" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.707782 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="registry-server" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.708030 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c61ea1f-15f2-4150-801c-00da5c5a3a34" containerName="registry-server" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.710178 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.719926 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.898252 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-458mc\" (UniqueName: \"kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.898332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.898890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 
16:11:55.903615 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.906120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:55 crc kubenswrapper[4773]: I0121 16:11:55.922466 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.002372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.002635 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-458mc\" (UniqueName: \"kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.002724 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.007869 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities\") pod \"certified-operators-h9mpp\" (UID: 
\"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.011722 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.029998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-458mc\" (UniqueName: \"kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc\") pod \"certified-operators-h9mpp\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.048219 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.105271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.105386 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.105436 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rs2s\" (UniqueName: \"kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.206945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.207129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content\") pod 
\"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.207193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rs2s\" (UniqueName: \"kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.207612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.207662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.234524 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rs2s\" (UniqueName: \"kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s\") pod \"community-operators-cdk8v\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.522903 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:11:56 crc kubenswrapper[4773]: I0121 16:11:56.789388 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:11:56 crc kubenswrapper[4773]: W0121 16:11:56.802929 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc021462e_b579_488a_aaf6_75617397232e.slice/crio-43d46187daa6b67d0efdbe941aedf0434b499f022ec3933688b73c5eef08cf38 WatchSource:0}: Error finding container 43d46187daa6b67d0efdbe941aedf0434b499f022ec3933688b73c5eef08cf38: Status 404 returned error can't find the container with id 43d46187daa6b67d0efdbe941aedf0434b499f022ec3933688b73c5eef08cf38 Jan 21 16:11:57 crc kubenswrapper[4773]: I0121 16:11:57.007592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerStarted","Data":"43d46187daa6b67d0efdbe941aedf0434b499f022ec3933688b73c5eef08cf38"} Jan 21 16:11:57 crc kubenswrapper[4773]: I0121 16:11:57.772314 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:11:58 crc kubenswrapper[4773]: I0121 16:11:58.019805 4773 generic.go:334] "Generic (PLEG): container finished" podID="d236deff-5213-451c-9d60-403fc48d2864" containerID="fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714" exitCode=0 Jan 21 16:11:58 crc kubenswrapper[4773]: I0121 16:11:58.019911 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerDied","Data":"fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714"} Jan 21 16:11:58 crc kubenswrapper[4773]: I0121 16:11:58.020193 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerStarted","Data":"0948259241c81598003a991605912c4fec588bd10e6547fcc9bcde7feb2dc485"} Jan 21 16:11:58 crc kubenswrapper[4773]: I0121 16:11:58.024108 4773 generic.go:334] "Generic (PLEG): container finished" podID="c021462e-b579-488a-aaf6-75617397232e" containerID="664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c" exitCode=0 Jan 21 16:11:58 crc kubenswrapper[4773]: I0121 16:11:58.024150 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerDied","Data":"664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c"} Jan 21 16:11:59 crc kubenswrapper[4773]: I0121 16:11:59.034517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerStarted","Data":"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4"} Jan 21 16:12:00 crc kubenswrapper[4773]: I0121 16:12:00.049951 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerStarted","Data":"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03"} Jan 21 16:12:01 crc kubenswrapper[4773]: I0121 16:12:01.061791 4773 generic.go:334] "Generic (PLEG): container finished" podID="d236deff-5213-451c-9d60-403fc48d2864" containerID="d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4" exitCode=0 Jan 21 16:12:01 crc kubenswrapper[4773]: I0121 16:12:01.061897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" 
event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerDied","Data":"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4"} Jan 21 16:12:03 crc kubenswrapper[4773]: I0121 16:12:03.082268 4773 generic.go:334] "Generic (PLEG): container finished" podID="c021462e-b579-488a-aaf6-75617397232e" containerID="35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03" exitCode=0 Jan 21 16:12:03 crc kubenswrapper[4773]: I0121 16:12:03.082531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerDied","Data":"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03"} Jan 21 16:12:03 crc kubenswrapper[4773]: I0121 16:12:03.088716 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerStarted","Data":"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f"} Jan 21 16:12:03 crc kubenswrapper[4773]: I0121 16:12:03.128540 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cdk8v" podStartSLOduration=3.917059012 podStartE2EDuration="8.128499663s" podCreationTimestamp="2026-01-21 16:11:55 +0000 UTC" firstStartedPulling="2026-01-21 16:11:58.021916487 +0000 UTC m=+2882.946406109" lastFinishedPulling="2026-01-21 16:12:02.233357138 +0000 UTC m=+2887.157846760" observedRunningTime="2026-01-21 16:12:03.128178464 +0000 UTC m=+2888.052668086" watchObservedRunningTime="2026-01-21 16:12:03.128499663 +0000 UTC m=+2888.052989285" Jan 21 16:12:06 crc kubenswrapper[4773]: I0121 16:12:06.134630 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" 
event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerStarted","Data":"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f"} Jan 21 16:12:06 crc kubenswrapper[4773]: I0121 16:12:06.151592 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9mpp" podStartSLOduration=4.268691614 podStartE2EDuration="11.151571741s" podCreationTimestamp="2026-01-21 16:11:55 +0000 UTC" firstStartedPulling="2026-01-21 16:11:58.025676419 +0000 UTC m=+2882.950166041" lastFinishedPulling="2026-01-21 16:12:04.908556546 +0000 UTC m=+2889.833046168" observedRunningTime="2026-01-21 16:12:06.150871062 +0000 UTC m=+2891.075360694" watchObservedRunningTime="2026-01-21 16:12:06.151571741 +0000 UTC m=+2891.076061363" Jan 21 16:12:06 crc kubenswrapper[4773]: I0121 16:12:06.524048 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:06 crc kubenswrapper[4773]: I0121 16:12:06.524406 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:06 crc kubenswrapper[4773]: I0121 16:12:06.583538 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:07 crc kubenswrapper[4773]: I0121 16:12:07.188917 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:08 crc kubenswrapper[4773]: I0121 16:12:08.492130 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.167817 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cdk8v" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="registry-server" 
containerID="cri-o://09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f" gracePeriod=2 Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.717085 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.825577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rs2s\" (UniqueName: \"kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s\") pod \"d236deff-5213-451c-9d60-403fc48d2864\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.825741 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities\") pod \"d236deff-5213-451c-9d60-403fc48d2864\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.825893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content\") pod \"d236deff-5213-451c-9d60-403fc48d2864\" (UID: \"d236deff-5213-451c-9d60-403fc48d2864\") " Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.826999 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities" (OuterVolumeSpecName: "utilities") pod "d236deff-5213-451c-9d60-403fc48d2864" (UID: "d236deff-5213-451c-9d60-403fc48d2864"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.827856 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.831964 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s" (OuterVolumeSpecName: "kube-api-access-4rs2s") pod "d236deff-5213-451c-9d60-403fc48d2864" (UID: "d236deff-5213-451c-9d60-403fc48d2864"). InnerVolumeSpecName "kube-api-access-4rs2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.880930 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d236deff-5213-451c-9d60-403fc48d2864" (UID: "d236deff-5213-451c-9d60-403fc48d2864"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.930253 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d236deff-5213-451c-9d60-403fc48d2864-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:09 crc kubenswrapper[4773]: I0121 16:12:09.930294 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rs2s\" (UniqueName: \"kubernetes.io/projected/d236deff-5213-451c-9d60-403fc48d2864-kube-api-access-4rs2s\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.182277 4773 generic.go:334] "Generic (PLEG): container finished" podID="d236deff-5213-451c-9d60-403fc48d2864" containerID="09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f" exitCode=0 Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.182444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerDied","Data":"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f"} Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.182969 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdk8v" event={"ID":"d236deff-5213-451c-9d60-403fc48d2864","Type":"ContainerDied","Data":"0948259241c81598003a991605912c4fec588bd10e6547fcc9bcde7feb2dc485"} Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.183037 4773 scope.go:117] "RemoveContainer" containerID="09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.182540 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdk8v" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.213287 4773 scope.go:117] "RemoveContainer" containerID="d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.225178 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.234965 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cdk8v"] Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.248615 4773 scope.go:117] "RemoveContainer" containerID="fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.294531 4773 scope.go:117] "RemoveContainer" containerID="09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f" Jan 21 16:12:10 crc kubenswrapper[4773]: E0121 16:12:10.295093 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f\": container with ID starting with 09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f not found: ID does not exist" containerID="09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.295155 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f"} err="failed to get container status \"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f\": rpc error: code = NotFound desc = could not find container \"09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f\": container with ID starting with 09751571b3bb5c97b0dd400371232f7830a6c21b70489a6887c16486c61d941f not 
found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.295190 4773 scope.go:117] "RemoveContainer" containerID="d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4" Jan 21 16:12:10 crc kubenswrapper[4773]: E0121 16:12:10.295541 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4\": container with ID starting with d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4 not found: ID does not exist" containerID="d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.295580 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4"} err="failed to get container status \"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4\": rpc error: code = NotFound desc = could not find container \"d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4\": container with ID starting with d1a1f9bf4b09b805f3f4cb7f338ecc05de1ba65b182136917df10b67655c29e4 not found: ID does not exist" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.295612 4773 scope.go:117] "RemoveContainer" containerID="fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714" Jan 21 16:12:10 crc kubenswrapper[4773]: E0121 16:12:10.295897 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714\": container with ID starting with fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714 not found: ID does not exist" containerID="fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714" Jan 21 16:12:10 crc kubenswrapper[4773]: I0121 16:12:10.295925 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714"} err="failed to get container status \"fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714\": rpc error: code = NotFound desc = could not find container \"fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714\": container with ID starting with fbbcef652b19c5e2e8c66249592ccc370c1e85b4c809c659b8392942d5ea8714 not found: ID does not exist" Jan 21 16:12:11 crc kubenswrapper[4773]: I0121 16:12:11.395008 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d236deff-5213-451c-9d60-403fc48d2864" path="/var/lib/kubelet/pods/d236deff-5213-451c-9d60-403fc48d2864/volumes" Jan 21 16:12:16 crc kubenswrapper[4773]: I0121 16:12:16.049901 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:16 crc kubenswrapper[4773]: I0121 16:12:16.050665 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:16 crc kubenswrapper[4773]: I0121 16:12:16.096867 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:16 crc kubenswrapper[4773]: I0121 16:12:16.299090 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:16 crc kubenswrapper[4773]: I0121 16:12:16.354024 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.259454 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9mpp" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="registry-server" 
containerID="cri-o://51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f" gracePeriod=2 Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.856095 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.920684 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content\") pod \"c021462e-b579-488a-aaf6-75617397232e\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.920868 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities\") pod \"c021462e-b579-488a-aaf6-75617397232e\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.920896 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-458mc\" (UniqueName: \"kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc\") pod \"c021462e-b579-488a-aaf6-75617397232e\" (UID: \"c021462e-b579-488a-aaf6-75617397232e\") " Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.924792 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities" (OuterVolumeSpecName: "utilities") pod "c021462e-b579-488a-aaf6-75617397232e" (UID: "c021462e-b579-488a-aaf6-75617397232e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.929987 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc" (OuterVolumeSpecName: "kube-api-access-458mc") pod "c021462e-b579-488a-aaf6-75617397232e" (UID: "c021462e-b579-488a-aaf6-75617397232e"). InnerVolumeSpecName "kube-api-access-458mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:18 crc kubenswrapper[4773]: I0121 16:12:18.970075 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c021462e-b579-488a-aaf6-75617397232e" (UID: "c021462e-b579-488a-aaf6-75617397232e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.024150 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-458mc\" (UniqueName: \"kubernetes.io/projected/c021462e-b579-488a-aaf6-75617397232e-kube-api-access-458mc\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.024203 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.024217 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c021462e-b579-488a-aaf6-75617397232e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.269966 4773 generic.go:334] "Generic (PLEG): container finished" podID="c021462e-b579-488a-aaf6-75617397232e" 
containerID="51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f" exitCode=0 Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.270018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerDied","Data":"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f"} Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.270079 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mpp" event={"ID":"c021462e-b579-488a-aaf6-75617397232e","Type":"ContainerDied","Data":"43d46187daa6b67d0efdbe941aedf0434b499f022ec3933688b73c5eef08cf38"} Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.270100 4773 scope.go:117] "RemoveContainer" containerID="51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.270040 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9mpp" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.299511 4773 scope.go:117] "RemoveContainer" containerID="35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.308914 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.317915 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9mpp"] Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.359450 4773 scope.go:117] "RemoveContainer" containerID="664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.381385 4773 scope.go:117] "RemoveContainer" containerID="51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f" Jan 21 16:12:19 crc kubenswrapper[4773]: E0121 16:12:19.381911 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f\": container with ID starting with 51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f not found: ID does not exist" containerID="51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.381962 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f"} err="failed to get container status \"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f\": rpc error: code = NotFound desc = could not find container \"51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f\": container with ID starting with 51e8cc42a2f436c92d53789312d96311c0ff86654864a6b280c7ab5279ef748f not 
found: ID does not exist" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.381993 4773 scope.go:117] "RemoveContainer" containerID="35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03" Jan 21 16:12:19 crc kubenswrapper[4773]: E0121 16:12:19.382293 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03\": container with ID starting with 35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03 not found: ID does not exist" containerID="35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.382343 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03"} err="failed to get container status \"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03\": rpc error: code = NotFound desc = could not find container \"35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03\": container with ID starting with 35959eb7b4e6fa7f941740ac11c62342a8c392ba3ee0431158ff4c9c543b8f03 not found: ID does not exist" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.382400 4773 scope.go:117] "RemoveContainer" containerID="664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c" Jan 21 16:12:19 crc kubenswrapper[4773]: E0121 16:12:19.382864 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c\": container with ID starting with 664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c not found: ID does not exist" containerID="664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.382892 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c"} err="failed to get container status \"664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c\": rpc error: code = NotFound desc = could not find container \"664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c\": container with ID starting with 664eccecc7de99df1fc4b84a322c4f81aa25039c46d222253393cc8301b4674c not found: ID does not exist" Jan 21 16:12:19 crc kubenswrapper[4773]: I0121 16:12:19.396800 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c021462e-b579-488a-aaf6-75617397232e" path="/var/lib/kubelet/pods/c021462e-b579-488a-aaf6-75617397232e/volumes" Jan 21 16:12:32 crc kubenswrapper[4773]: I0121 16:12:32.403607 4773 generic.go:334] "Generic (PLEG): container finished" podID="88d600dd-1b0b-4e33-a91f-4375318fdc5f" containerID="d38fc267ae20a8f0ee0e45eac671a41c97648626a00d77bc69ebd1da59043852" exitCode=0 Jan 21 16:12:32 crc kubenswrapper[4773]: I0121 16:12:32.403664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" event={"ID":"88d600dd-1b0b-4e33-a91f-4375318fdc5f","Type":"ContainerDied","Data":"d38fc267ae20a8f0ee0e45eac671a41c97648626a00d77bc69ebd1da59043852"} Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.936488 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967402 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967468 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967513 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967543 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 
16:12:33.967597 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdcdq\" (UniqueName: \"kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967647 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967768 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.967877 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1\") pod \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\" (UID: \"88d600dd-1b0b-4e33-a91f-4375318fdc5f\") " Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.983010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.996044 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq" (OuterVolumeSpecName: "kube-api-access-fdcdq") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "kube-api-access-fdcdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:12:33 crc kubenswrapper[4773]: I0121 16:12:33.998404 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.001995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.002993 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.013381 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.018148 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.019795 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.022756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory" (OuterVolumeSpecName: "inventory") pod "88d600dd-1b0b-4e33-a91f-4375318fdc5f" (UID: "88d600dd-1b0b-4e33-a91f-4375318fdc5f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069561 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069603 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069620 4773 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069631 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069643 4773 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069655 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdcdq\" (UniqueName: \"kubernetes.io/projected/88d600dd-1b0b-4e33-a91f-4375318fdc5f-kube-api-access-fdcdq\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069665 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-inventory\") on node 
\"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069675 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.069685 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/88d600dd-1b0b-4e33-a91f-4375318fdc5f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.423554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" event={"ID":"88d600dd-1b0b-4e33-a91f-4375318fdc5f","Type":"ContainerDied","Data":"6462f2435a449d7f3307b577826aa1f43a5057ad4b566ba30899f6e14cf1df55"} Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.424153 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6462f2435a449d7f3307b577826aa1f43a5057ad4b566ba30899f6e14cf1df55" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.423629 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cvdbt" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.532248 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6"] Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.532918 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="extract-utilities" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.532938 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="extract-utilities" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.532958 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d600dd-1b0b-4e33-a91f-4375318fdc5f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.532967 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d600dd-1b0b-4e33-a91f-4375318fdc5f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.532988 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.532996 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.533013 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="extract-utilities" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533021 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="extract-utilities" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.533043 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="extract-content" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533052 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="extract-content" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.533064 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533072 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: E0121 16:12:34.533089 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="extract-content" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533097 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="extract-content" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533365 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c021462e-b579-488a-aaf6-75617397232e" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533388 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d236deff-5213-451c-9d60-403fc48d2864" containerName="registry-server" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.533405 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d600dd-1b0b-4e33-a91f-4375318fdc5f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.534316 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.539333 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.539507 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.539508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxlck" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.539802 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.540922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.544161 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6"] Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580030 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580095 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfk67\" (UniqueName: \"kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580259 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580301 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.580319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682207 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682285 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfk67\" (UniqueName: \"kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682387 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.682508 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.698537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.698573 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.698537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.698979 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.699384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.699502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.716496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfk67\" (UniqueName: \"kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:34 crc kubenswrapper[4773]: I0121 16:12:34.860827 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:12:35 crc kubenswrapper[4773]: I0121 16:12:35.404767 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6"] Jan 21 16:12:35 crc kubenswrapper[4773]: I0121 16:12:35.437532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" event={"ID":"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c","Type":"ContainerStarted","Data":"d5bf4075f555c2e5dfb0d74d6792c5f8f102856b53a071acebd3ed496c568b59"} Jan 21 16:12:36 crc kubenswrapper[4773]: I0121 16:12:36.447032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" event={"ID":"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c","Type":"ContainerStarted","Data":"3c3199376958955d04a6913dfeb05e85c439e13670d9e7e6d69350a3cc70344c"} Jan 21 16:12:37 crc kubenswrapper[4773]: I0121 16:12:37.495968 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" podStartSLOduration=2.7352122789999997 podStartE2EDuration="3.495948341s" podCreationTimestamp="2026-01-21 16:12:34 +0000 UTC" firstStartedPulling="2026-01-21 16:12:35.410465468 +0000 UTC m=+2920.334955090" lastFinishedPulling="2026-01-21 16:12:36.17120153 +0000 UTC m=+2921.095691152" observedRunningTime="2026-01-21 16:12:37.48191107 +0000 UTC m=+2922.406400692" watchObservedRunningTime="2026-01-21 16:12:37.495948341 +0000 UTC m=+2922.420437963" Jan 21 16:12:55 crc kubenswrapper[4773]: I0121 16:12:55.206078 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:12:55 crc 
kubenswrapper[4773]: I0121 16:12:55.207013 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:13:25 crc kubenswrapper[4773]: I0121 16:13:25.205564 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:13:25 crc kubenswrapper[4773]: I0121 16:13:25.206162 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:13:55 crc kubenswrapper[4773]: I0121 16:13:55.205774 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:13:55 crc kubenswrapper[4773]: I0121 16:13:55.206368 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:13:55 crc kubenswrapper[4773]: I0121 16:13:55.206445 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:13:55 crc kubenswrapper[4773]: I0121 16:13:55.207297 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:13:55 crc kubenswrapper[4773]: I0121 16:13:55.207375 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" gracePeriod=600 Jan 21 16:13:55 crc kubenswrapper[4773]: E0121 16:13:55.833900 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:13:56 crc kubenswrapper[4773]: I0121 16:13:56.245116 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" exitCode=0 Jan 21 16:13:56 crc kubenswrapper[4773]: I0121 16:13:56.245163 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d"} 
Jan 21 16:13:56 crc kubenswrapper[4773]: I0121 16:13:56.245230 4773 scope.go:117] "RemoveContainer" containerID="13c066ce10a9b3578dd357e22edb08641875832f497ce277dbd8bef0aec27aa8" Jan 21 16:13:56 crc kubenswrapper[4773]: I0121 16:13:56.246033 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:13:56 crc kubenswrapper[4773]: E0121 16:13:56.246364 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:14:10 crc kubenswrapper[4773]: I0121 16:14:10.383974 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:14:10 crc kubenswrapper[4773]: E0121 16:14:10.384793 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:14:23 crc kubenswrapper[4773]: I0121 16:14:23.384036 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:14:23 crc kubenswrapper[4773]: E0121 16:14:23.384923 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:14:34 crc kubenswrapper[4773]: I0121 16:14:34.383988 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:14:34 crc kubenswrapper[4773]: E0121 16:14:34.384746 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:14:46 crc kubenswrapper[4773]: I0121 16:14:46.384091 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:14:46 crc kubenswrapper[4773]: E0121 16:14:46.385521 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.159856 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6"] Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.162334 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.165166 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.166537 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.175131 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6"] Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.272795 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtq47\" (UniqueName: \"kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.272948 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.273058 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.374818 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtq47\" (UniqueName: \"kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.374920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.374990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.376068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.383022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.392464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtq47\" (UniqueName: \"kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47\") pod \"collect-profiles-29483535-znvl6\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.487670 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:00 crc kubenswrapper[4773]: I0121 16:15:00.980285 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6"] Jan 21 16:15:01 crc kubenswrapper[4773]: I0121 16:15:01.390988 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:15:01 crc kubenswrapper[4773]: E0121 16:15:01.392027 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:15:01 crc kubenswrapper[4773]: I0121 16:15:01.856428 4773 generic.go:334] "Generic (PLEG): container finished" podID="ffd839d1-7464-4060-98c8-b22f706471ff" containerID="60df97948b8ad659b2694b77b9238efff19c2705a954b78a911970aeb9b459e6" 
exitCode=0 Jan 21 16:15:01 crc kubenswrapper[4773]: I0121 16:15:01.856528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" event={"ID":"ffd839d1-7464-4060-98c8-b22f706471ff","Type":"ContainerDied","Data":"60df97948b8ad659b2694b77b9238efff19c2705a954b78a911970aeb9b459e6"} Jan 21 16:15:01 crc kubenswrapper[4773]: I0121 16:15:01.857138 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" event={"ID":"ffd839d1-7464-4060-98c8-b22f706471ff","Type":"ContainerStarted","Data":"0021ef28814558256e0e5346431889a9cea26d2f3d96501111f8d0df75a3ef4b"} Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.277393 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.436170 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtq47\" (UniqueName: \"kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47\") pod \"ffd839d1-7464-4060-98c8-b22f706471ff\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.436313 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume\") pod \"ffd839d1-7464-4060-98c8-b22f706471ff\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.436403 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume\") pod \"ffd839d1-7464-4060-98c8-b22f706471ff\" (UID: \"ffd839d1-7464-4060-98c8-b22f706471ff\") " Jan 21 
16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.436980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffd839d1-7464-4060-98c8-b22f706471ff" (UID: "ffd839d1-7464-4060-98c8-b22f706471ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.442340 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47" (OuterVolumeSpecName: "kube-api-access-jtq47") pod "ffd839d1-7464-4060-98c8-b22f706471ff" (UID: "ffd839d1-7464-4060-98c8-b22f706471ff"). InnerVolumeSpecName "kube-api-access-jtq47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.447384 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffd839d1-7464-4060-98c8-b22f706471ff" (UID: "ffd839d1-7464-4060-98c8-b22f706471ff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.539075 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd839d1-7464-4060-98c8-b22f706471ff-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.539123 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffd839d1-7464-4060-98c8-b22f706471ff-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.539133 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtq47\" (UniqueName: \"kubernetes.io/projected/ffd839d1-7464-4060-98c8-b22f706471ff-kube-api-access-jtq47\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.878675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" event={"ID":"ffd839d1-7464-4060-98c8-b22f706471ff","Type":"ContainerDied","Data":"0021ef28814558256e0e5346431889a9cea26d2f3d96501111f8d0df75a3ef4b"} Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.879022 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0021ef28814558256e0e5346431889a9cea26d2f3d96501111f8d0df75a3ef4b" Jan 21 16:15:03 crc kubenswrapper[4773]: I0121 16:15:03.878758 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6" Jan 21 16:15:04 crc kubenswrapper[4773]: E0121 16:15:04.102813 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd839d1_7464_4060_98c8_b22f706471ff.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd839d1_7464_4060_98c8_b22f706471ff.slice/crio-0021ef28814558256e0e5346431889a9cea26d2f3d96501111f8d0df75a3ef4b\": RecentStats: unable to find data in memory cache]" Jan 21 16:15:04 crc kubenswrapper[4773]: I0121 16:15:04.348842 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"] Jan 21 16:15:04 crc kubenswrapper[4773]: I0121 16:15:04.358607 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483490-vw8sh"] Jan 21 16:15:05 crc kubenswrapper[4773]: I0121 16:15:05.398443 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1eed7a3-89e0-412b-b931-9e3905a32965" path="/var/lib/kubelet/pods/d1eed7a3-89e0-412b-b931-9e3905a32965/volumes" Jan 21 16:15:11 crc kubenswrapper[4773]: I0121 16:15:11.948291 4773 generic.go:334] "Generic (PLEG): container finished" podID="95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" containerID="3c3199376958955d04a6913dfeb05e85c439e13670d9e7e6d69350a3cc70344c" exitCode=0 Jan 21 16:15:11 crc kubenswrapper[4773]: I0121 16:15:11.948375 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" event={"ID":"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c","Type":"ContainerDied","Data":"3c3199376958955d04a6913dfeb05e85c439e13670d9e7e6d69350a3cc70344c"} Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.494582 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.649683 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.649970 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.650086 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.650164 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.650208 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 
16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.650376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.650415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfk67\" (UniqueName: \"kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67\") pod \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\" (UID: \"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c\") " Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.657797 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67" (OuterVolumeSpecName: "kube-api-access-kfk67") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "kube-api-access-kfk67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.659911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.689883 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.691885 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.707635 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.710431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.716299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory" (OuterVolumeSpecName: "inventory") pod "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" (UID: "95dfd2a7-6742-4cd5-8d1e-144e1b176a4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752794 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752846 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752856 4773 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752866 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752876 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752885 4773 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfk67\" (UniqueName: \"kubernetes.io/projected/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-kube-api-access-kfk67\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.752893 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/95dfd2a7-6742-4cd5-8d1e-144e1b176a4c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.967187 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" event={"ID":"95dfd2a7-6742-4cd5-8d1e-144e1b176a4c","Type":"ContainerDied","Data":"d5bf4075f555c2e5dfb0d74d6792c5f8f102856b53a071acebd3ed496c568b59"} Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.967225 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bf4075f555c2e5dfb0d74d6792c5f8f102856b53a071acebd3ed496c568b59" Jan 21 16:15:13 crc kubenswrapper[4773]: I0121 16:15:13.967271 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6" Jan 21 16:15:16 crc kubenswrapper[4773]: I0121 16:15:16.383510 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:15:16 crc kubenswrapper[4773]: E0121 16:15:16.384058 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:15:22 crc kubenswrapper[4773]: I0121 16:15:22.505214 4773 scope.go:117] "RemoveContainer" containerID="cebb40f15a1fe212d9b28102edc3463fd99fdd63727d02c79f72e2271b1a098f" Jan 21 16:15:31 crc kubenswrapper[4773]: I0121 16:15:31.384882 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:15:31 crc kubenswrapper[4773]: E0121 16:15:31.385661 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:15:43 crc kubenswrapper[4773]: I0121 16:15:43.384059 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:15:43 crc kubenswrapper[4773]: E0121 16:15:43.384904 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:15:58 crc kubenswrapper[4773]: I0121 16:15:58.384288 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:15:58 crc kubenswrapper[4773]: E0121 16:15:58.385499 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:16:09 crc kubenswrapper[4773]: I0121 16:16:09.384681 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:16:09 crc kubenswrapper[4773]: E0121 16:16:09.386350 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:16:24 crc kubenswrapper[4773]: I0121 16:16:24.383447 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:16:24 crc kubenswrapper[4773]: E0121 16:16:24.384269 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:16:39 crc kubenswrapper[4773]: I0121 16:16:39.384095 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:16:39 crc kubenswrapper[4773]: E0121 16:16:39.389009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.446818 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 16:16:47 crc kubenswrapper[4773]: E0121 16:16:47.447667 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd839d1-7464-4060-98c8-b22f706471ff" containerName="collect-profiles" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.447681 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd839d1-7464-4060-98c8-b22f706471ff" containerName="collect-profiles" Jan 21 16:16:47 crc kubenswrapper[4773]: E0121 16:16:47.447704 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.447712 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.447938 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd839d1-7464-4060-98c8-b22f706471ff" containerName="collect-profiles" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.447959 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="95dfd2a7-6742-4cd5-8d1e-144e1b176a4c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.448731 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.451510 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.451774 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.452024 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lbl2p" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.458927 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.463381 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555093 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9f4\" (UniqueName: \"kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 
16:16:47.555151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555234 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555424 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555594 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555718 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555854 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.555925 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9f4\" (UniqueName: \"kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658382 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658445 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658482 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658515 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.658661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.659182 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.659468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.660150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.660261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.660445 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.665855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.674915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.675159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.709531 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr9f4\" (UniqueName: \"kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4\") pod \"tempest-tests-tempest\" (UID: 
\"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.713118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") " pod="openstack/tempest-tests-tempest" Jan 21 16:16:47 crc kubenswrapper[4773]: I0121 16:16:47.772207 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 16:16:48 crc kubenswrapper[4773]: W0121 16:16:48.323260 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee2faa5_89c4_4798_aebe_1d27e3f9861f.slice/crio-5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a WatchSource:0}: Error finding container 5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a: Status 404 returned error can't find the container with id 5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a Jan 21 16:16:48 crc kubenswrapper[4773]: I0121 16:16:48.325253 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:16:48 crc kubenswrapper[4773]: I0121 16:16:48.325308 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 16:16:48 crc kubenswrapper[4773]: I0121 16:16:48.843939 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aee2faa5-89c4-4798-aebe-1d27e3f9861f","Type":"ContainerStarted","Data":"5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a"} Jan 21 16:16:54 crc kubenswrapper[4773]: I0121 16:16:54.384324 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:16:54 crc kubenswrapper[4773]: E0121 
16:16:54.385230 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:17:09 crc kubenswrapper[4773]: I0121 16:17:09.384197 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:17:09 crc kubenswrapper[4773]: E0121 16:17:09.385037 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:17:24 crc kubenswrapper[4773]: I0121 16:17:24.383961 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:17:24 crc kubenswrapper[4773]: E0121 16:17:24.388877 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:17:26 crc kubenswrapper[4773]: E0121 16:17:26.803594 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 21 16:17:26 crc kubenswrapper[4773]: E0121 16:17:26.804087 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},V
olumeMount{Name:kube-api-access-tr9f4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(aee2faa5-89c4-4798-aebe-1d27e3f9861f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 16:17:26 crc kubenswrapper[4773]: E0121 16:17:26.805271 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" Jan 21 16:17:27 crc kubenswrapper[4773]: E0121 16:17:27.232448 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" Jan 21 16:17:35 crc kubenswrapper[4773]: I0121 16:17:35.390489 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:17:35 crc kubenswrapper[4773]: E0121 16:17:35.391355 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:17:43 crc kubenswrapper[4773]: I0121 16:17:43.853951 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 21 16:17:45 crc kubenswrapper[4773]: I0121 16:17:45.405582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aee2faa5-89c4-4798-aebe-1d27e3f9861f","Type":"ContainerStarted","Data":"b12822ae1b812ed00380a633f9cb37dffd1fe13a6edb659a6975e825efe91264"} Jan 21 16:17:45 crc kubenswrapper[4773]: I0121 16:17:45.435728 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.909856688 podStartE2EDuration="59.435706326s" podCreationTimestamp="2026-01-21 16:16:46 +0000 UTC" firstStartedPulling="2026-01-21 16:16:48.325074668 +0000 UTC m=+3173.249564290" lastFinishedPulling="2026-01-21 16:17:43.850924296 +0000 UTC m=+3228.775413928" observedRunningTime="2026-01-21 16:17:45.425400327 +0000 UTC m=+3230.349889949" watchObservedRunningTime="2026-01-21 16:17:45.435706326 +0000 UTC m=+3230.360195968" Jan 21 16:17:48 crc 
kubenswrapper[4773]: I0121 16:17:48.384514 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:17:48 crc kubenswrapper[4773]: E0121 16:17:48.385507 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:18:02 crc kubenswrapper[4773]: I0121 16:18:02.384295 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:18:02 crc kubenswrapper[4773]: E0121 16:18:02.385013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:18:15 crc kubenswrapper[4773]: I0121 16:18:15.392501 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:18:15 crc kubenswrapper[4773]: E0121 16:18:15.393220 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 
21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.826195 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.829016 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.847786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.951946 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.952122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:16 crc kubenswrapper[4773]: I0121 16:18:16.952217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6wf\" (UniqueName: \"kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.053623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities\") 
pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.053788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6wf\" (UniqueName: \"kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.053837 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.054216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.054298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content\") pod \"redhat-operators-kwkmx\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.085152 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6wf\" (UniqueName: \"kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf\") pod \"redhat-operators-kwkmx\" (UID: 
\"3eff1a70-323c-4698-88ec-86c8627caab3\") " pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.150129 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.653766 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:17 crc kubenswrapper[4773]: I0121 16:18:17.701846 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerStarted","Data":"a36b0abd19dfb98bcc8e4b803bd711662dee5ba757ca10fed5fe757ad708577f"} Jan 21 16:18:18 crc kubenswrapper[4773]: I0121 16:18:18.712036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerDied","Data":"91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53"} Jan 21 16:18:18 crc kubenswrapper[4773]: I0121 16:18:18.712036 4773 generic.go:334] "Generic (PLEG): container finished" podID="3eff1a70-323c-4698-88ec-86c8627caab3" containerID="91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53" exitCode=0 Jan 21 16:18:21 crc kubenswrapper[4773]: I0121 16:18:21.742503 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerStarted","Data":"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98"} Jan 21 16:18:23 crc kubenswrapper[4773]: I0121 16:18:23.763858 4773 generic.go:334] "Generic (PLEG): container finished" podID="3eff1a70-323c-4698-88ec-86c8627caab3" containerID="37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98" exitCode=0 Jan 21 16:18:23 crc kubenswrapper[4773]: I0121 16:18:23.763929 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerDied","Data":"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98"} Jan 21 16:18:25 crc kubenswrapper[4773]: I0121 16:18:25.799642 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerStarted","Data":"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e"} Jan 21 16:18:25 crc kubenswrapper[4773]: I0121 16:18:25.818100 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwkmx" podStartSLOduration=3.900761579 podStartE2EDuration="9.818083862s" podCreationTimestamp="2026-01-21 16:18:16 +0000 UTC" firstStartedPulling="2026-01-21 16:18:18.715221942 +0000 UTC m=+3263.639711564" lastFinishedPulling="2026-01-21 16:18:24.632544225 +0000 UTC m=+3269.557033847" observedRunningTime="2026-01-21 16:18:25.814401172 +0000 UTC m=+3270.738890814" watchObservedRunningTime="2026-01-21 16:18:25.818083862 +0000 UTC m=+3270.742573484" Jan 21 16:18:26 crc kubenswrapper[4773]: I0121 16:18:26.384882 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:18:26 crc kubenswrapper[4773]: E0121 16:18:26.385553 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:18:27 crc kubenswrapper[4773]: I0121 16:18:27.150905 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:27 crc kubenswrapper[4773]: I0121 16:18:27.150964 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:28 crc kubenswrapper[4773]: I0121 16:18:28.205116 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwkmx" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="registry-server" probeResult="failure" output=< Jan 21 16:18:28 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 16:18:28 crc kubenswrapper[4773]: > Jan 21 16:18:37 crc kubenswrapper[4773]: I0121 16:18:37.200189 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:37 crc kubenswrapper[4773]: I0121 16:18:37.251787 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:37 crc kubenswrapper[4773]: I0121 16:18:37.435349 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:38 crc kubenswrapper[4773]: I0121 16:18:38.938425 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwkmx" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="registry-server" containerID="cri-o://f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e" gracePeriod=2 Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.623403 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.752512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content\") pod \"3eff1a70-323c-4698-88ec-86c8627caab3\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.753032 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x6wf\" (UniqueName: \"kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf\") pod \"3eff1a70-323c-4698-88ec-86c8627caab3\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.753140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities\") pod \"3eff1a70-323c-4698-88ec-86c8627caab3\" (UID: \"3eff1a70-323c-4698-88ec-86c8627caab3\") " Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.754102 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities" (OuterVolumeSpecName: "utilities") pod "3eff1a70-323c-4698-88ec-86c8627caab3" (UID: "3eff1a70-323c-4698-88ec-86c8627caab3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.761510 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf" (OuterVolumeSpecName: "kube-api-access-9x6wf") pod "3eff1a70-323c-4698-88ec-86c8627caab3" (UID: "3eff1a70-323c-4698-88ec-86c8627caab3"). InnerVolumeSpecName "kube-api-access-9x6wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.856020 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x6wf\" (UniqueName: \"kubernetes.io/projected/3eff1a70-323c-4698-88ec-86c8627caab3-kube-api-access-9x6wf\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.856256 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.866138 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3eff1a70-323c-4698-88ec-86c8627caab3" (UID: "3eff1a70-323c-4698-88ec-86c8627caab3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.951606 4773 generic.go:334] "Generic (PLEG): container finished" podID="3eff1a70-323c-4698-88ec-86c8627caab3" containerID="f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e" exitCode=0 Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.951657 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerDied","Data":"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e"} Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.951691 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwkmx" event={"ID":"3eff1a70-323c-4698-88ec-86c8627caab3","Type":"ContainerDied","Data":"a36b0abd19dfb98bcc8e4b803bd711662dee5ba757ca10fed5fe757ad708577f"} Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.951661 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwkmx" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.951723 4773 scope.go:117] "RemoveContainer" containerID="f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.986683 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eff1a70-323c-4698-88ec-86c8627caab3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:18:39 crc kubenswrapper[4773]: I0121 16:18:39.989778 4773 scope.go:117] "RemoveContainer" containerID="37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.005812 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.013262 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwkmx"] Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.017200 4773 scope.go:117] "RemoveContainer" containerID="91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.065068 4773 scope.go:117] "RemoveContainer" containerID="f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e" Jan 21 16:18:40 crc kubenswrapper[4773]: E0121 16:18:40.066304 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e\": container with ID starting with f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e not found: ID does not exist" containerID="f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.066349 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e"} err="failed to get container status \"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e\": rpc error: code = NotFound desc = could not find container \"f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e\": container with ID starting with f9adf755366fb2828801db6f0cd6280f28dd5128ccfcd63ede8e8d8ef683005e not found: ID does not exist" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.066376 4773 scope.go:117] "RemoveContainer" containerID="37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98" Jan 21 16:18:40 crc kubenswrapper[4773]: E0121 16:18:40.066941 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98\": container with ID starting with 37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98 not found: ID does not exist" containerID="37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.066985 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98"} err="failed to get container status \"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98\": rpc error: code = NotFound desc = could not find container \"37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98\": container with ID starting with 37c326a4f2c17ea1967b17d4fbcca3c5651f2b85c07276319b3a0298f2ee5b98 not found: ID does not exist" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.067063 4773 scope.go:117] "RemoveContainer" containerID="91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53" Jan 21 16:18:40 crc kubenswrapper[4773]: E0121 
16:18:40.067475 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53\": container with ID starting with 91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53 not found: ID does not exist" containerID="91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.067529 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53"} err="failed to get container status \"91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53\": rpc error: code = NotFound desc = could not find container \"91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53\": container with ID starting with 91cf7329083c360a02c77ec546e71ee82d34e6b1897020dadb24eb4c75bcee53 not found: ID does not exist" Jan 21 16:18:40 crc kubenswrapper[4773]: I0121 16:18:40.384456 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:18:40 crc kubenswrapper[4773]: E0121 16:18:40.385294 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:18:41 crc kubenswrapper[4773]: I0121 16:18:41.395485 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" path="/var/lib/kubelet/pods/3eff1a70-323c-4698-88ec-86c8627caab3/volumes" Jan 21 16:18:55 crc kubenswrapper[4773]: I0121 16:18:55.393739 
4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:18:56 crc kubenswrapper[4773]: I0121 16:18:56.172548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9"} Jan 21 16:20:55 crc kubenswrapper[4773]: I0121 16:20:55.206424 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:20:55 crc kubenswrapper[4773]: I0121 16:20:55.207011 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:21:25 crc kubenswrapper[4773]: I0121 16:21:25.206142 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:21:25 crc kubenswrapper[4773]: I0121 16:21:25.206725 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 
16:21:55.205583 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.206112 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.206167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.207160 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.207225 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9" gracePeriod=600 Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.922842 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9" exitCode=0 Jan 21 
16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.922921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9"} Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.923191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"} Jan 21 16:21:55 crc kubenswrapper[4773]: I0121 16:21:55.923218 4773 scope.go:117] "RemoveContainer" containerID="cb64e2a62c22a72f632dc47c23b8ea54a01a5abdb5e2de8168c2b2a79437b89d" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.663314 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5q65q"] Jan 21 16:21:56 crc kubenswrapper[4773]: E0121 16:21:56.665246 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="extract-utilities" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.665283 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="extract-utilities" Jan 21 16:21:56 crc kubenswrapper[4773]: E0121 16:21:56.665308 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="registry-server" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.665319 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="registry-server" Jan 21 16:21:56 crc kubenswrapper[4773]: E0121 16:21:56.665363 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" 
containerName="extract-content" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.665374 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="extract-content" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.667914 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eff1a70-323c-4698-88ec-86c8627caab3" containerName="registry-server" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.672743 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.681497 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q65q"] Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.719108 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.719160 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.719329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fgtl\" (UniqueName: \"kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " 
pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.820903 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fgtl\" (UniqueName: \"kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.821037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.821069 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.821555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q" Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.821557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " 
pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:21:56 crc kubenswrapper[4773]: I0121 16:21:56.840363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fgtl\" (UniqueName: \"kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl\") pod \"certified-operators-5q65q\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") " pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:21:57 crc kubenswrapper[4773]: I0121 16:21:57.005935 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:21:57 crc kubenswrapper[4773]: I0121 16:21:57.675237 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q65q"]
Jan 21 16:21:57 crc kubenswrapper[4773]: W0121 16:21:57.690784 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14237906_dc49_487f_9e22_49d705bd21e9.slice/crio-b181ca0fb7b0087a05f316093f1e6786cbca5c9609d3b45506c48ea74c5d42eb WatchSource:0}: Error finding container b181ca0fb7b0087a05f316093f1e6786cbca5c9609d3b45506c48ea74c5d42eb: Status 404 returned error can't find the container with id b181ca0fb7b0087a05f316093f1e6786cbca5c9609d3b45506c48ea74c5d42eb
Jan 21 16:21:57 crc kubenswrapper[4773]: I0121 16:21:57.957082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerStarted","Data":"b181ca0fb7b0087a05f316093f1e6786cbca5c9609d3b45506c48ea74c5d42eb"}
Jan 21 16:21:58 crc kubenswrapper[4773]: I0121 16:21:58.967764 4773 generic.go:334] "Generic (PLEG): container finished" podID="14237906-dc49-487f-9e22-49d705bd21e9" containerID="5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2" exitCode=0
Jan 21 16:21:58 crc kubenswrapper[4773]: I0121 16:21:58.967870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerDied","Data":"5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2"}
Jan 21 16:21:58 crc kubenswrapper[4773]: I0121 16:21:58.970236 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 16:21:59 crc kubenswrapper[4773]: I0121 16:21:59.978994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerStarted","Data":"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"}
Jan 21 16:22:02 crc kubenswrapper[4773]: I0121 16:22:02.000444 4773 generic.go:334] "Generic (PLEG): container finished" podID="14237906-dc49-487f-9e22-49d705bd21e9" containerID="9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10" exitCode=0
Jan 21 16:22:02 crc kubenswrapper[4773]: I0121 16:22:02.000565 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerDied","Data":"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"}
Jan 21 16:22:03 crc kubenswrapper[4773]: I0121 16:22:03.012888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerStarted","Data":"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"}
Jan 21 16:22:03 crc kubenswrapper[4773]: I0121 16:22:03.032927 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5q65q" podStartSLOduration=3.593290592 podStartE2EDuration="7.03290871s" podCreationTimestamp="2026-01-21 16:21:56 +0000 UTC" firstStartedPulling="2026-01-21 16:21:58.969963806 +0000 UTC m=+3483.894453418" lastFinishedPulling="2026-01-21 16:22:02.409581914 +0000 UTC m=+3487.334071536" observedRunningTime="2026-01-21 16:22:03.031541043 +0000 UTC m=+3487.956030665" watchObservedRunningTime="2026-01-21 16:22:03.03290871 +0000 UTC m=+3487.957398342"
Jan 21 16:22:07 crc kubenswrapper[4773]: I0121 16:22:07.007553 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:07 crc kubenswrapper[4773]: I0121 16:22:07.009058 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:07 crc kubenswrapper[4773]: I0121 16:22:07.059396 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:08 crc kubenswrapper[4773]: I0121 16:22:08.100191 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:08 crc kubenswrapper[4773]: I0121 16:22:08.163860 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q65q"]
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.076472 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5q65q" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="registry-server" containerID="cri-o://ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e" gracePeriod=2
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.807325 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.942608 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content\") pod \"14237906-dc49-487f-9e22-49d705bd21e9\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") "
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.942653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fgtl\" (UniqueName: \"kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl\") pod \"14237906-dc49-487f-9e22-49d705bd21e9\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") "
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.942672 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities\") pod \"14237906-dc49-487f-9e22-49d705bd21e9\" (UID: \"14237906-dc49-487f-9e22-49d705bd21e9\") "
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.943773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities" (OuterVolumeSpecName: "utilities") pod "14237906-dc49-487f-9e22-49d705bd21e9" (UID: "14237906-dc49-487f-9e22-49d705bd21e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.948960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl" (OuterVolumeSpecName: "kube-api-access-6fgtl") pod "14237906-dc49-487f-9e22-49d705bd21e9" (UID: "14237906-dc49-487f-9e22-49d705bd21e9"). InnerVolumeSpecName "kube-api-access-6fgtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:22:10 crc kubenswrapper[4773]: I0121 16:22:10.995436 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14237906-dc49-487f-9e22-49d705bd21e9" (UID: "14237906-dc49-487f-9e22-49d705bd21e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.044823 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.044854 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fgtl\" (UniqueName: \"kubernetes.io/projected/14237906-dc49-487f-9e22-49d705bd21e9-kube-api-access-6fgtl\") on node \"crc\" DevicePath \"\""
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.044868 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14237906-dc49-487f-9e22-49d705bd21e9-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.088350 4773 generic.go:334] "Generic (PLEG): container finished" podID="14237906-dc49-487f-9e22-49d705bd21e9" containerID="ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e" exitCode=0
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.088397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerDied","Data":"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"}
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.088437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q65q" event={"ID":"14237906-dc49-487f-9e22-49d705bd21e9","Type":"ContainerDied","Data":"b181ca0fb7b0087a05f316093f1e6786cbca5c9609d3b45506c48ea74c5d42eb"}
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.088458 4773 scope.go:117] "RemoveContainer" containerID="ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.089424 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q65q"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.107828 4773 scope.go:117] "RemoveContainer" containerID="9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.123997 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q65q"]
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.133721 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5q65q"]
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.154332 4773 scope.go:117] "RemoveContainer" containerID="5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.190548 4773 scope.go:117] "RemoveContainer" containerID="ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"
Jan 21 16:22:11 crc kubenswrapper[4773]: E0121 16:22:11.191133 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e\": container with ID starting with ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e not found: ID does not exist" containerID="ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.191171 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e"} err="failed to get container status \"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e\": rpc error: code = NotFound desc = could not find container \"ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e\": container with ID starting with ec9ef3bc6bd4edbbfdbbe1927ff03c5ad28263b18d006abfbc0a78c2d151e24e not found: ID does not exist"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.191197 4773 scope.go:117] "RemoveContainer" containerID="9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"
Jan 21 16:22:11 crc kubenswrapper[4773]: E0121 16:22:11.191998 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10\": container with ID starting with 9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10 not found: ID does not exist" containerID="9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.192053 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10"} err="failed to get container status \"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10\": rpc error: code = NotFound desc = could not find container \"9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10\": container with ID starting with 9afa8db52cde999ca49c9ec9b7bc3029c183b602ae53940509aade81454b7a10 not found: ID does not exist"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.192092 4773 scope.go:117] "RemoveContainer" containerID="5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2"
Jan 21 16:22:11 crc kubenswrapper[4773]: E0121 16:22:11.192507 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2\": container with ID starting with 5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2 not found: ID does not exist" containerID="5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.192576 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2"} err="failed to get container status \"5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2\": rpc error: code = NotFound desc = could not find container \"5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2\": container with ID starting with 5d6ad63c96d53f5b71dba98d7b3bef399436b8c0bf47a0206f125cde9600f4f2 not found: ID does not exist"
Jan 21 16:22:11 crc kubenswrapper[4773]: I0121 16:22:11.402798 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14237906-dc49-487f-9e22-49d705bd21e9" path="/var/lib/kubelet/pods/14237906-dc49-487f-9e22-49d705bd21e9/volumes"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.876822 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pr8hm"]
Jan 21 16:23:08 crc kubenswrapper[4773]: E0121 16:23:08.877897 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="extract-utilities"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.877915 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="extract-utilities"
Jan 21 16:23:08 crc kubenswrapper[4773]: E0121 16:23:08.877933 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="registry-server"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.877942 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="registry-server"
Jan 21 16:23:08 crc kubenswrapper[4773]: E0121 16:23:08.877959 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="extract-content"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.877966 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="extract-content"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.878209 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="14237906-dc49-487f-9e22-49d705bd21e9" containerName="registry-server"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.880232 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.891550 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pr8hm"]
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.905048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6v4\" (UniqueName: \"kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.905119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:08 crc kubenswrapper[4773]: I0121 16:23:08.905147 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.007189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6v4\" (UniqueName: \"kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.007290 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.007337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.007920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.008045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.046791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6v4\" (UniqueName: \"kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4\") pod \"community-operators-pr8hm\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") " pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.204105 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:09 crc kubenswrapper[4773]: I0121 16:23:09.792580 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pr8hm"]
Jan 21 16:23:10 crc kubenswrapper[4773]: I0121 16:23:10.660562 4773 generic.go:334] "Generic (PLEG): container finished" podID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerID="48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f" exitCode=0
Jan 21 16:23:10 crc kubenswrapper[4773]: I0121 16:23:10.660831 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerDied","Data":"48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f"}
Jan 21 16:23:10 crc kubenswrapper[4773]: I0121 16:23:10.660880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerStarted","Data":"40fb408342cbc8e89298ddeb098f9a26983fd6bc94a79d43019d5249e4b5612a"}
Jan 21 16:23:16 crc kubenswrapper[4773]: I0121 16:23:16.711194 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerStarted","Data":"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26"}
Jan 21 16:23:18 crc kubenswrapper[4773]: I0121 16:23:18.729066 4773 generic.go:334] "Generic (PLEG): container finished" podID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerID="7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26" exitCode=0
Jan 21 16:23:18 crc kubenswrapper[4773]: I0121 16:23:18.729136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerDied","Data":"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26"}
Jan 21 16:23:20 crc kubenswrapper[4773]: I0121 16:23:20.752635 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerStarted","Data":"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca"}
Jan 21 16:23:20 crc kubenswrapper[4773]: I0121 16:23:20.778551 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pr8hm" podStartSLOduration=3.303906641 podStartE2EDuration="12.778528862s" podCreationTimestamp="2026-01-21 16:23:08 +0000 UTC" firstStartedPulling="2026-01-21 16:23:10.662181127 +0000 UTC m=+3555.586670749" lastFinishedPulling="2026-01-21 16:23:20.136803348 +0000 UTC m=+3565.061292970" observedRunningTime="2026-01-21 16:23:20.775253374 +0000 UTC m=+3565.699743006" watchObservedRunningTime="2026-01-21 16:23:20.778528862 +0000 UTC m=+3565.703018484"
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.205089 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.205764 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.263605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.838516 4773 generic.go:334] "Generic (PLEG): container finished" podID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" containerID="b12822ae1b812ed00380a633f9cb37dffd1fe13a6edb659a6975e825efe91264" exitCode=0
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.838921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aee2faa5-89c4-4798-aebe-1d27e3f9861f","Type":"ContainerDied","Data":"b12822ae1b812ed00380a633f9cb37dffd1fe13a6edb659a6975e825efe91264"}
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.889188 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:29 crc kubenswrapper[4773]: I0121 16:23:29.956644 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pr8hm"]
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.456536 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.591601 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr9f4\" (UniqueName: \"kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.591653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.591840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.591938 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.591965 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592001 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592045 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key\") pod \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\" (UID: \"aee2faa5-89c4-4798-aebe-1d27e3f9861f\") "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592313 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592480 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data" (OuterVolumeSpecName: "config-data") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592816 4773 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.592855 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.603888 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.603980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4" (OuterVolumeSpecName: "kube-api-access-tr9f4") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "kube-api-access-tr9f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.629639 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.640480 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.654912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.657647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694291 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694324 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ssh-key\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694335 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr9f4\" (UniqueName: \"kubernetes.io/projected/aee2faa5-89c4-4798-aebe-1d27e3f9861f-kube-api-access-tr9f4\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694345 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694355 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.694363 4773 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aee2faa5-89c4-4798-aebe-1d27e3f9861f-ca-certs\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.716013 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.796882 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.858085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aee2faa5-89c4-4798-aebe-1d27e3f9861f","Type":"ContainerDied","Data":"5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a"}
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.858521 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd5b80e3e027982c347e3097f986207af55b877f777f6a566f186fa4ae6f95a"
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.858230 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pr8hm" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="registry-server" containerID="cri-o://0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca" gracePeriod=2
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.858137 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 21 16:23:31 crc kubenswrapper[4773]: I0121 16:23:31.965711 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "aee2faa5-89c4-4798-aebe-1d27e3f9861f" (UID: "aee2faa5-89c4-4798-aebe-1d27e3f9861f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.000425 4773 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aee2faa5-89c4-4798-aebe-1d27e3f9861f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.616826 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8hm"
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.713863 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content\") pod \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") "
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.714067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities\") pod \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") "
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.714184 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x6v4\" (UniqueName: \"kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4\") pod \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\" (UID: \"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4\") "
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.723303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities" (OuterVolumeSpecName: "utilities") pod "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" (UID: "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.727252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4" (OuterVolumeSpecName: "kube-api-access-7x6v4") pod "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" (UID: "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4"). InnerVolumeSpecName "kube-api-access-7x6v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.800979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" (UID: "d5e535eb-0417-4c7d-bcbb-52e69f0b90e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.816584 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.816624 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x6v4\" (UniqueName: \"kubernetes.io/projected/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-kube-api-access-7x6v4\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.816634 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.869797 4773 generic.go:334] "Generic (PLEG): container finished" podID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerID="0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca" exitCode=0
Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.869849 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pr8hm" Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.869853 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerDied","Data":"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca"} Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.869997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8hm" event={"ID":"d5e535eb-0417-4c7d-bcbb-52e69f0b90e4","Type":"ContainerDied","Data":"40fb408342cbc8e89298ddeb098f9a26983fd6bc94a79d43019d5249e4b5612a"} Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.870041 4773 scope.go:117] "RemoveContainer" containerID="0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca" Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.941851 4773 scope.go:117] "RemoveContainer" containerID="7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26" Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.953819 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pr8hm"] Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.973129 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pr8hm"] Jan 21 16:23:32 crc kubenswrapper[4773]: I0121 16:23:32.976800 4773 scope.go:117] "RemoveContainer" containerID="48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.023317 4773 scope.go:117] "RemoveContainer" containerID="0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca" Jan 21 16:23:33 crc kubenswrapper[4773]: E0121 16:23:33.023836 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca\": container with ID starting with 0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca not found: ID does not exist" containerID="0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.023864 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca"} err="failed to get container status \"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca\": rpc error: code = NotFound desc = could not find container \"0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca\": container with ID starting with 0648dffcc4b7eb7f9fc35a6f73d7b05bdada83e2cdbdf22621c276c92b69c9ca not found: ID does not exist" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.023885 4773 scope.go:117] "RemoveContainer" containerID="7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26" Jan 21 16:23:33 crc kubenswrapper[4773]: E0121 16:23:33.024160 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26\": container with ID starting with 7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26 not found: ID does not exist" containerID="7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.024201 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26"} err="failed to get container status \"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26\": rpc error: code = NotFound desc = could not find container \"7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26\": container with ID 
starting with 7e7fce4b06fe6ffca703ae67e073e739bcf27b46b0483899ff43e41a55ae8f26 not found: ID does not exist" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.024230 4773 scope.go:117] "RemoveContainer" containerID="48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f" Jan 21 16:23:33 crc kubenswrapper[4773]: E0121 16:23:33.024567 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f\": container with ID starting with 48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f not found: ID does not exist" containerID="48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.024596 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f"} err="failed to get container status \"48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f\": rpc error: code = NotFound desc = could not find container \"48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f\": container with ID starting with 48247ba4a1fb538fb4389e3a7e9e603f543ccd32142b93502dd56c224b18c17f not found: ID does not exist" Jan 21 16:23:33 crc kubenswrapper[4773]: I0121 16:23:33.395512 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" path="/var/lib/kubelet/pods/d5e535eb-0417-4c7d-bcbb-52e69f0b90e4/volumes" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.897862 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 16:23:38 crc kubenswrapper[4773]: E0121 16:23:38.899743 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="extract-content" Jan 21 
16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.899770 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="extract-content" Jan 21 16:23:38 crc kubenswrapper[4773]: E0121 16:23:38.899788 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="extract-utilities" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.899799 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="extract-utilities" Jan 21 16:23:38 crc kubenswrapper[4773]: E0121 16:23:38.899835 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" containerName="tempest-tests-tempest-tests-runner" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.899844 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" containerName="tempest-tests-tempest-tests-runner" Jan 21 16:23:38 crc kubenswrapper[4773]: E0121 16:23:38.899881 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="registry-server" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.899889 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="registry-server" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.900107 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee2faa5-89c4-4798-aebe-1d27e3f9861f" containerName="tempest-tests-tempest-tests-runner" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.900148 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e535eb-0417-4c7d-bcbb-52e69f0b90e4" containerName="registry-server" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.901063 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.903680 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lbl2p" Jan 21 16:23:38 crc kubenswrapper[4773]: I0121 16:23:38.909568 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.063644 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h5cp\" (UniqueName: \"kubernetes.io/projected/8fd39fe4-18b4-4c46-966c-25d48d2c596f-kube-api-access-6h5cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.064127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.165801 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h5cp\" (UniqueName: \"kubernetes.io/projected/8fd39fe4-18b4-4c46-966c-25d48d2c596f-kube-api-access-6h5cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.165934 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.166359 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.185454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h5cp\" (UniqueName: \"kubernetes.io/projected/8fd39fe4-18b4-4c46-966c-25d48d2c596f-kube-api-access-6h5cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.202232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fd39fe4-18b4-4c46-966c-25d48d2c596f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.228537 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.690508 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 16:23:39 crc kubenswrapper[4773]: I0121 16:23:39.943918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8fd39fe4-18b4-4c46-966c-25d48d2c596f","Type":"ContainerStarted","Data":"71b8ce148b971b0de0b3abe7d35ebb18cee45e5c3e039311c13569b02a760845"} Jan 21 16:23:41 crc kubenswrapper[4773]: I0121 16:23:41.963548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8fd39fe4-18b4-4c46-966c-25d48d2c596f","Type":"ContainerStarted","Data":"95c11748b0b21ee0a93af12d1c4aaa004ff82255b16991930fcb3c82a85a0926"} Jan 21 16:23:41 crc kubenswrapper[4773]: I0121 16:23:41.994965 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.578866933 podStartE2EDuration="3.994940307s" podCreationTimestamp="2026-01-21 16:23:38 +0000 UTC" firstStartedPulling="2026-01-21 16:23:39.702239846 +0000 UTC m=+3584.626729468" lastFinishedPulling="2026-01-21 16:23:41.11831322 +0000 UTC m=+3586.042802842" observedRunningTime="2026-01-21 16:23:41.984793833 +0000 UTC m=+3586.909283455" watchObservedRunningTime="2026-01-21 16:23:41.994940307 +0000 UTC m=+3586.919429929" Jan 21 16:23:55 crc kubenswrapper[4773]: I0121 16:23:55.205818 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:23:55 crc 
kubenswrapper[4773]: I0121 16:23:55.206299 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:24:25 crc kubenswrapper[4773]: I0121 16:24:25.205876 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:24:25 crc kubenswrapper[4773]: I0121 16:24:25.206470 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.378386 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh9kw/must-gather-vvdxp"] Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.381354 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.384283 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xh9kw"/"openshift-service-ca.crt" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.384599 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xh9kw"/"kube-root-ca.crt" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.385254 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xh9kw"/"default-dockercfg-wnbcn" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.399144 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xh9kw/must-gather-vvdxp"] Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.510415 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-must-gather-output\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.510513 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497kq\" (UniqueName: \"kubernetes.io/projected/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-kube-api-access-497kq\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.613879 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-must-gather-output\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " 
pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.614006 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-497kq\" (UniqueName: \"kubernetes.io/projected/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-kube-api-access-497kq\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.615334 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-must-gather-output\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.640746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-497kq\" (UniqueName: \"kubernetes.io/projected/d73b2e69-aa17-4bbd-8ba1-41e6cd49daed-kube-api-access-497kq\") pod \"must-gather-vvdxp\" (UID: \"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed\") " pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:33 crc kubenswrapper[4773]: I0121 16:24:33.766881 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" Jan 21 16:24:34 crc kubenswrapper[4773]: I0121 16:24:34.257501 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xh9kw/must-gather-vvdxp"] Jan 21 16:24:34 crc kubenswrapper[4773]: I0121 16:24:34.550552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" event={"ID":"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed","Type":"ContainerStarted","Data":"fe614d08cf6ed38f53b53711a1fe856a59245509938f1b6ca2983b49228e9eee"} Jan 21 16:24:43 crc kubenswrapper[4773]: I0121 16:24:43.663141 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" event={"ID":"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed","Type":"ContainerStarted","Data":"95a1a74dea6dd190a61df01d5f4c8afec8f4ae404a203c0d3aba0e3e813d396b"} Jan 21 16:24:43 crc kubenswrapper[4773]: I0121 16:24:43.663928 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" event={"ID":"d73b2e69-aa17-4bbd-8ba1-41e6cd49daed","Type":"ContainerStarted","Data":"ad1e0b609b6581762d9253fa4c0d388a9b0335f3d5dde9c6278796091ae090fa"} Jan 21 16:24:43 crc kubenswrapper[4773]: I0121 16:24:43.692851 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh9kw/must-gather-vvdxp" podStartSLOduration=2.47268818 podStartE2EDuration="10.692832048s" podCreationTimestamp="2026-01-21 16:24:33 +0000 UTC" firstStartedPulling="2026-01-21 16:24:34.266507952 +0000 UTC m=+3639.190997574" lastFinishedPulling="2026-01-21 16:24:42.48665183 +0000 UTC m=+3647.411141442" observedRunningTime="2026-01-21 16:24:43.682773576 +0000 UTC m=+3648.607263208" watchObservedRunningTime="2026-01-21 16:24:43.692832048 +0000 UTC m=+3648.617321670" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.189416 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-xh9kw/crc-debug-f46cn"] Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.191859 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.279428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.279539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvn4r\" (UniqueName: \"kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.381751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.381906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.382297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvn4r\" (UniqueName: 
\"kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.419269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvn4r\" (UniqueName: \"kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r\") pod \"crc-debug-f46cn\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.516443 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:24:47 crc kubenswrapper[4773]: I0121 16:24:47.701643 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" event={"ID":"75ee650b-9398-448d-bc1a-2f7eacd9f2d9","Type":"ContainerStarted","Data":"e372f2687bc383de3ac3a50b34f3465ec8ada7c263e7c5d2d6b6022be594b182"} Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.441833 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8f6a4485-3dde-469d-98ee-026edcc3eb76/alertmanager/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.449024 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8f6a4485-3dde-469d-98ee-026edcc3eb76/config-reloader/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.455956 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8f6a4485-3dde-469d-98ee-026edcc3eb76/init-config-reloader/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.514342 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5cf4556c4-hwkr9_8e7d6f73-a63d-40a4-acda-12edb288ec53/barbican-api-log/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.521659 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cf4556c4-hwkr9_8e7d6f73-a63d-40a4-acda-12edb288ec53/barbican-api/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.671009 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d45c98f8b-b4vzj_36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d/barbican-keystone-listener-log/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.680415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d45c98f8b-b4vzj_36d5f91b-6476-4d00-a9ed-fe1d2b4fe36d/barbican-keystone-listener/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.696977 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-794b46c66c-dj5lp_e7d5b884-1964-4585-a6b4-bd7813ee52c8/barbican-worker-log/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.705983 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-794b46c66c-dj5lp_e7d5b884-1964-4585-a6b4-bd7813ee52c8/barbican-worker/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.768829 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bdfp6_449961cf-fcdf-4c35-9387-c3055f3364cb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:50 crc kubenswrapper[4773]: I0121 16:24:50.967717 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61b8d46c-ed1d-4e8c-9d65-4c901fc300e4/ceilometer-central-agent/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.071745 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_61b8d46c-ed1d-4e8c-9d65-4c901fc300e4/ceilometer-notification-agent/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.093168 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61b8d46c-ed1d-4e8c-9d65-4c901fc300e4/sg-core/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.105584 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_61b8d46c-ed1d-4e8c-9d65-4c901fc300e4/proxy-httpd/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.124482 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4453fa11-ade2-4d7d-a714-67525df64b70/cinder-api-log/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.166006 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4453fa11-ade2-4d7d-a714-67525df64b70/cinder-api/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.200055 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5c8c084f-2abc-435a-80fc-e8101b086e50/cinder-scheduler/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.221176 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5c8c084f-2abc-435a-80fc-e8101b086e50/probe/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.239189 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_7b666481-ecf0-4091-8d05-b403082294fe/cloudkitty-api-log/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.356083 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_7b666481-ecf0-4091-8d05-b403082294fe/cloudkitty-api/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.372987 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_44cd53f0-37e0-4b02-9922-49d99dfee92a/loki-compactor/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.399209 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-66dfd9bb-xkhx7_e87ce1ac-65bc-4e61-a72a-381b6f7653f9/loki-distributor/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.419011 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-6lg2s_45eeec38-a51c-4228-8616-335dc3b951b7/gateway/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.465829 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-bvhqr_3bd14b5f-6a17-41d2-bd18-522651121850/gateway/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.565394 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_9e04988a-e98a-4c9d-9a51-e0e69a6810c9/loki-index-gateway/0.log" Jan 21 16:24:51 crc kubenswrapper[4773]: I0121 16:24:51.706518 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_68c0e8c6-bc28-4101-a1d5-99ce639ae62c/loki-ingester/0.log" Jan 21 16:24:52 crc kubenswrapper[4773]: I0121 16:24:52.232044 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-795fd8f8cc-9886v_b00c4a6e-de8c-43d7-a190-bd512a63f9de/loki-querier/0.log" Jan 21 16:24:52 crc kubenswrapper[4773]: I0121 16:24:52.338587 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-5cd44666df-phpsd_103932c1-cb6f-4a90-9d70-0dcc1787a5b7/loki-query-frontend/0.log" Jan 21 16:24:55 crc kubenswrapper[4773]: I0121 16:24:55.205671 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:24:55 crc kubenswrapper[4773]: I0121 16:24:55.205919 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:24:55 crc kubenswrapper[4773]: I0121 16:24:55.205959 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:24:55 crc kubenswrapper[4773]: I0121 16:24:55.206639 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:24:55 crc kubenswrapper[4773]: I0121 16:24:55.206680 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" gracePeriod=600 Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.360833 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_aa4ebaf9-3bb5-4ec1-8a28-4b22e2d77109/cloudkitty-proc/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.433265 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nj6bh_e1bb300f-6606-458a-aacc-3432b7ad314d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: E0121 16:24:56.470571 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.476340 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9r5gv_7031589c-e137-46d5-afdf-77044617bfa2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.515718 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-sm9b8_597cc973-99fd-42ab-99a3-1009ad011d10/dnsmasq-dns/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.535920 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-sm9b8_597cc973-99fd-42ab-99a3-1009ad011d10/init/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.586804 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-g8nsg_3726d208-3122-4e0e-a802-7f9b0c59621c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.610177 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4d7b9329-7502-4e45-bf22-cfe4d7f5451b/glance-log/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.633937 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4d7b9329-7502-4e45-bf22-cfe4d7f5451b/glance-httpd/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.653927 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_805f46cc-de71-4353-9cb3-075eb306ace0/glance-log/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.692932 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_805f46cc-de71-4353-9cb3-075eb306ace0/glance-httpd/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.722185 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gn864_1d844687-a8ab-4fab-8b3d-fb3210db5d86/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.752992 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-msv6h_6bb2f7be-feb4-4081-a299-204701555c02/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.842305 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" exitCode=0 Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.842350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"} Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.842382 4773 scope.go:117] "RemoveContainer" containerID="b6c1e82b06a68e14f8e95678571e3110eb2980afa4f616516fa87912eb096db9" Jan 21 16:24:56 crc kubenswrapper[4773]: I0121 16:24:56.843149 4773 
scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:24:56 crc kubenswrapper[4773]: E0121 16:24:56.843441 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:24:57 crc kubenswrapper[4773]: I0121 16:24:57.133519 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6cb88dccdd-v7jgc_d5f9230a-2f00-48b8-bd84-4e080a4b907e/keystone-api/0.log" Jan 21 16:24:57 crc kubenswrapper[4773]: I0121 16:24:57.148228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29483521-49spb_33ba1912-c7fc-40d8-b046-98d8d6e7931b/keystone-cron/0.log" Jan 21 16:24:57 crc kubenswrapper[4773]: I0121 16:24:57.173103 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0182c704-9c2c-460e-8fb3-083edaa77855/kube-state-metrics/0.log" Jan 21 16:24:57 crc kubenswrapper[4773]: I0121 16:24:57.234901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-c7qrl_c6a589bb-2c6a-48e3-80bd-daa3599ba7fd/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:06 crc kubenswrapper[4773]: E0121 16:25:06.675613 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Jan 21 16:25:06 crc kubenswrapper[4773]: E0121 16:25:06.676301 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvn4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File
,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-f46cn_openshift-must-gather-xh9kw(75ee650b-9398-448d-bc1a-2f7eacd9f2d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 16:25:06 crc kubenswrapper[4773]: E0121 16:25:06.677525 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" Jan 21 16:25:06 crc kubenswrapper[4773]: E0121 16:25:06.963121 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" Jan 21 16:25:09 crc kubenswrapper[4773]: I0121 16:25:09.384011 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:25:09 crc kubenswrapper[4773]: E0121 16:25:09.384853 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:25:14 crc kubenswrapper[4773]: I0121 16:25:14.831442 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_eb62c429-ac2b-4654-84e2-c92b0508eba4/memcached/0.log" Jan 21 16:25:14 crc kubenswrapper[4773]: I0121 16:25:14.978483 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-746866d6b5-jbp68_711d6a0d-d24f-4d48-b73b-3b5418fe12bf/neutron-api/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.042077 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-746866d6b5-jbp68_711d6a0d-d24f-4d48-b73b-3b5418fe12bf/neutron-httpd/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.069744 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-brg4n_60b51255-6cb0-404b-9431-a04ded467081/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.239122 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ead6527b-43a9-4f30-a682-b5e5bd25207e/nova-api-log/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.336715 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/manager/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.343330 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/kube-rbac-proxy/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.510365 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ead6527b-43a9-4f30-a682-b5e5bd25207e/nova-api-api/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.619439 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_aa487edb-f8c0-439e-af92-83c72235393e/nova-cell0-conductor-conductor/0.log" Jan 21 
16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.713267 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8985a73a-071c-41cd-9828-d74a631c7606/nova-cell1-conductor-conductor/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.811378 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c380503c-e5d5-45c3-aeea-5997a1e792c5/nova-cell1-novncproxy-novncproxy/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.865214 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cvdbt_88d600dd-1b0b-4e33-a91f-4375318fdc5f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:15 crc kubenswrapper[4773]: I0121 16:25:15.947910 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a/nova-metadata-log/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.640646 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7fa4e0c1-4ca6-40a8-8eb4-6d8a5193762a/nova-metadata-metadata/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.734709 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9dbed76e-56cf-4a05-b29a-0e2bc8454441/nova-scheduler-scheduler/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.756464 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_869ad9c0-3593-4ebc-9b58-7b9615e46927/galera/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.766755 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_869ad9c0-3593-4ebc-9b58-7b9615e46927/mysql-bootstrap/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.797267 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_32888aa3-cb52-484f-9745-5d5dfc5179df/galera/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.809019 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32888aa3-cb52-484f-9745-5d5dfc5179df/mysql-bootstrap/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.831910 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d34079f2-2d08-4ddc-8d49-a9afaadaba8c/openstackclient/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.851811 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8t22l_3ee4f520-ec7d-4b0e-8138-ef2fbcc559b7/openstack-network-exporter/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.875962 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvwkv_b457bfe0-3f48-4e19-88a8-2b1ccefa549f/ovsdb-server/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.887026 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvwkv_b457bfe0-3f48-4e19-88a8-2b1ccefa549f/ovs-vswitchd/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.897059 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvwkv_b457bfe0-3f48-4e19-88a8-2b1ccefa549f/ovsdb-server-init/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.918945 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zwrfs_1f582857-cae4-4fa2-896d-b763b224ad8e/ovn-controller/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.951346 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/controller/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.957126 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/kube-rbac-proxy/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.961350 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-htv6p_1ab3d038-3af9-4719-872d-fc431de9959b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.975841 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fd966624-79ad-4926-9253-741b8f1e6fe4/ovn-northd/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.978717 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/controller/0.log" Jan 21 16:25:16 crc kubenswrapper[4773]: I0121 16:25:16.982741 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fd966624-79ad-4926-9253-741b8f1e6fe4/openstack-network-exporter/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.020855 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_088efb4e-fd31-4648-88cf-ceac1edb1723/ovsdbserver-nb/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.027653 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_088efb4e-fd31-4648-88cf-ceac1edb1723/openstack-network-exporter/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.067518 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7656b240-dd39-4bd3-8cdb-2f5103f17656/ovsdbserver-sb/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.081224 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7656b240-dd39-4bd3-8cdb-2f5103f17656/openstack-network-exporter/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.231161 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-5cf6c78dd-68gm6_b54ed186-3f20-46df-8d62-e6a4daa84fed/placement-log/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.346088 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cf6c78dd-68gm6_b54ed186-3f20-46df-8d62-e6a4daa84fed/placement-api/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.389725 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8/prometheus/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.397937 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8/config-reloader/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.408955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8/thanos-sidecar/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.421425 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_16ae82b0-cc5c-4e5d-9f49-55a813ffbfe8/init-config-reloader/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.451987 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_13e55b8e-491d-4d97-a0cf-56433eb4a7f1/rabbitmq/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.466198 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_13e55b8e-491d-4d97-a0cf-56433eb4a7f1/setup-container/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.520055 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c80bcc16-be97-47d8-afb6-7a1378546882/rabbitmq/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.530483 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_c80bcc16-be97-47d8-afb6-7a1378546882/setup-container/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.556234 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-s2qkg_5d404546-874a-474a-ac90-b6be34ed0420/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.569163 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pn58d_c150a8ca-3fb1-431a-b4a8-1dabb0dcdf17/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.593229 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j7brx_246d4cbd-1a69-4a02-99ab-f716001d4e67/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.703507 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zmlnd_545811bf-853d-41fb-847b-8a483a017894/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.725789 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9xcxn_acf5cb49-230b-4c79-b383-3ea958daeede/ssh-known-hosts-edpm-deployment/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.897762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6dcbb8dcff-bjhnc_22988650-1474-4ba4-a6c0-2deb003ae3e7/proxy-httpd/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.922475 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6dcbb8dcff-bjhnc_22988650-1474-4ba4-a6c0-2deb003ae3e7/proxy-server/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.935463 4773 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-k6w2q_626ecec5-3380-45fa-a2b1-248ee0af1328/swift-ring-rebalance/0.log" Jan 21 16:25:17 crc kubenswrapper[4773]: I0121 16:25:17.986228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/account-server/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.014790 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/account-replicator/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.024469 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/account-auditor/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.038584 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/account-reaper/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.053268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/container-server/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.092200 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/container-replicator/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.099243 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/container-auditor/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.108177 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/container-updater/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.123482 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/object-server/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.140402 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/object-replicator/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.159873 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/object-auditor/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.169111 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/object-updater/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.177993 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/object-expirer/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.186560 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/rsync/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.196645 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_111512a9-4e17-4433-a7e9-e8666099d12f/swift-recon-cron/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.284179 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-z5vn6_95dfd2a7-6742-4cd5-8d1e-144e1b176a4c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.324807 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_aee2faa5-89c4-4798-aebe-1d27e3f9861f/tempest-tests-tempest-tests-runner/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.332613 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8fd39fe4-18b4-4c46-966c-25d48d2c596f/test-operator-logs-container/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.345600 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4b626_58070518-459d-4437-8aa7-7a532264b18d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.712229 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.722797 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/reloader/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.730259 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr-metrics/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.739678 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.747598 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy-frr/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.753788 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-frr-files/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.769206 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-reloader/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.779947 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-metrics/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.792431 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-bqp84_1d536346-20d9-48b7-92d2-dd043c7cca4a/frr-k8s-webhook-server/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.825194 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5dcc476cd5-nk2zw_bfc5861c-71cc-4485-8fbd-cc661354fe03/manager/0.log" Jan 21 16:25:18 crc kubenswrapper[4773]: I0121 16:25:18.838023 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d664df476-46xlp_222e3273-8373-410d-b8f1-fe19aa307ed5/webhook-server/0.log" Jan 21 16:25:19 crc kubenswrapper[4773]: I0121 16:25:19.220175 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/speaker/0.log" Jan 21 16:25:19 crc kubenswrapper[4773]: I0121 16:25:19.229333 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/kube-rbac-proxy/0.log" Jan 21 16:25:20 crc kubenswrapper[4773]: I0121 16:25:20.384063 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:25:20 crc kubenswrapper[4773]: E0121 16:25:20.387330 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 
16:25:22 crc kubenswrapper[4773]: I0121 16:25:22.146379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" event={"ID":"75ee650b-9398-448d-bc1a-2f7eacd9f2d9","Type":"ContainerStarted","Data":"9d5fbe322cf265dac523acc8b611b9867572e5245997c3c4f438d4f0bb6c7830"} Jan 21 16:25:22 crc kubenswrapper[4773]: I0121 16:25:22.164403 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" podStartSLOduration=1.81993322 podStartE2EDuration="35.164386703s" podCreationTimestamp="2026-01-21 16:24:47 +0000 UTC" firstStartedPulling="2026-01-21 16:24:47.563368029 +0000 UTC m=+3652.487857651" lastFinishedPulling="2026-01-21 16:25:20.907821512 +0000 UTC m=+3685.832311134" observedRunningTime="2026-01-21 16:25:22.162353818 +0000 UTC m=+3687.086843440" watchObservedRunningTime="2026-01-21 16:25:22.164386703 +0000 UTC m=+3687.088876325" Jan 21 16:25:33 crc kubenswrapper[4773]: I0121 16:25:33.383608 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:25:33 crc kubenswrapper[4773]: E0121 16:25:33.384311 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.483955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/extract/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.493645 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/util/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.503542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/pull/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.584749 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-h8g4t_330099a5-d43b-482e-a4cb-e6c3bb2c6706/manager/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.631411 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-lvnp8_8678664b-38d8-4482-ae3d-fa1a74a709fd/manager/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.641753 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-s7bsl_516d7adc-2317-406b-92ef-6ed5a74a74b3/manager/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.725553 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-7kgnl_ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c/manager/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.740347 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-hpqt5_8a54ffaf-3268-4696-952e-ee6381310628/manager/0.log" Jan 21 16:25:36 crc kubenswrapper[4773]: I0121 16:25:36.756663 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-wrww2_50f1e60f-1194-428b-b7e2-ccf0ebb384c7/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.128886 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-pldbp_fdfe2fce-12c1-4026-b40f-77234a609986/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.142547 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-c79x5_32f1de73-4ee0-4eda-8709-d1642d8452f2/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.209803 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-fcbjv_d60c449c-a583-4b8e-8265-9df068220041/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.222119 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-xdcfb_2264dd36-5855-49cc-bf31-1d1e9dcb1f9f/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.279223 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-jz6m4_43aaba2b-296a-407d-9ea2-bbf4c05e868e/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.332845 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d9k6n_9a48a802-404e-4a60-821b-8b91a4830da8/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.429572 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-nwvmf_b7286f1c-434c-4ebb-9d2a-54a6596a63b5/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.441116 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-kn644_ddac57c3-b102-4cfc-8b1e-53de342cef39/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.463607 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz_d8cb173b-7eaa-4183-8028-0a1c4730097c/manager/0.log" Jan 21 16:25:37 crc kubenswrapper[4773]: I0121 16:25:37.599967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57bcf57cd7-v96fc_baf015b3-f5b5-4467-8469-bccd49ba94ae/operator/0.log" Jan 21 16:25:38 crc kubenswrapper[4773]: I0121 16:25:38.812911 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c6777f5bd-z474b_83918de1-f089-46b5-99e4-b249fbe09d65/manager/0.log" Jan 21 16:25:38 crc kubenswrapper[4773]: I0121 16:25:38.892685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tzlwr_219ae24e-95b5-4a93-b89b-335ef51b2166/registry-server/0.log" Jan 21 16:25:38 crc kubenswrapper[4773]: I0121 16:25:38.946813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-cn7gp_0f906590-d519-4724-bc67-05c6b3a9191d/manager/0.log" Jan 21 16:25:38 crc kubenswrapper[4773]: I0121 16:25:38.980096 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2bt26_6411a5d3-7b7b-4735-b01c-7c4aa0d5509c/manager/0.log" Jan 21 16:25:39 crc kubenswrapper[4773]: I0121 16:25:39.004062 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dp2s7_02b38141-e855-4eeb-ac52-d135fb5f44f7/operator/0.log" Jan 21 16:25:39 crc kubenswrapper[4773]: I0121 16:25:39.036616 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-xs55j_7ab140ad-f64b-45e3-a393-f66567e98a9f/manager/0.log" Jan 21 16:25:39 crc kubenswrapper[4773]: I0121 
16:25:39.427907 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5c4ff57dc8-78tss_4a9d0079-9636-4913-95fd-305e8d54280d/manager/0.log" Jan 21 16:25:39 crc kubenswrapper[4773]: I0121 16:25:39.437319 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-94knv_eeb5c272-4544-47a4-8d08-187872fea7bd/manager/0.log" Jan 21 16:25:39 crc kubenswrapper[4773]: I0121 16:25:39.451799 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-2wbz9_d7b52472-5c30-471f-a937-c50d96103339/manager/0.log" Jan 21 16:25:46 crc kubenswrapper[4773]: I0121 16:25:46.456745 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7pqk4_cafc9bd5-4993-4fcf-ba6d-91028b10e7e8/control-plane-machine-set-operator/0.log" Jan 21 16:25:46 crc kubenswrapper[4773]: I0121 16:25:46.474355 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6c6p_569e4aab-6b67-4448-9e6e-ecab14ebc87e/kube-rbac-proxy/0.log" Jan 21 16:25:46 crc kubenswrapper[4773]: I0121 16:25:46.484545 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6c6p_569e4aab-6b67-4448-9e6e-ecab14ebc87e/machine-api-operator/0.log" Jan 21 16:25:47 crc kubenswrapper[4773]: I0121 16:25:47.383950 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:25:47 crc kubenswrapper[4773]: E0121 16:25:47.384587 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:25:58 crc kubenswrapper[4773]: I0121 16:25:58.385422 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:25:58 crc kubenswrapper[4773]: E0121 16:25:58.386180 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.214624 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.218054 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.229845 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.333498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbj9\" (UniqueName: \"kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.333840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.333997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.436442 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbj9\" (UniqueName: \"kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.436780 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.436912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.437216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.437247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.459353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbj9\" (UniqueName: \"kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9\") pod \"redhat-marketplace-b9sn4\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:06 crc kubenswrapper[4773]: I0121 16:26:06.553560 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:07 crc kubenswrapper[4773]: I0121 16:26:07.105608 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:07 crc kubenswrapper[4773]: I0121 16:26:07.593379 4773 generic.go:334] "Generic (PLEG): container finished" podID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerID="5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be" exitCode=0 Jan 21 16:26:07 crc kubenswrapper[4773]: I0121 16:26:07.593538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerDied","Data":"5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be"} Jan 21 16:26:07 crc kubenswrapper[4773]: I0121 16:26:07.593826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerStarted","Data":"2f397a97adc14ff5de8e65b0de1201a5134469632380ed302bb26cc4b1e2669b"} Jan 21 16:26:09 crc kubenswrapper[4773]: I0121 16:26:09.637507 4773 generic.go:334] "Generic (PLEG): container finished" podID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerID="855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca" exitCode=0 Jan 21 16:26:09 crc kubenswrapper[4773]: I0121 16:26:09.637729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerDied","Data":"855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca"} Jan 21 16:26:11 crc kubenswrapper[4773]: I0121 16:26:11.384131 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:26:11 crc kubenswrapper[4773]: E0121 16:26:11.384595 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:26:11 crc kubenswrapper[4773]: I0121 16:26:11.669679 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerStarted","Data":"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c"} Jan 21 16:26:11 crc kubenswrapper[4773]: I0121 16:26:11.696717 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9sn4" podStartSLOduration=3.267861682 podStartE2EDuration="5.696663513s" podCreationTimestamp="2026-01-21 16:26:06 +0000 UTC" firstStartedPulling="2026-01-21 16:26:07.595261549 +0000 UTC m=+3732.519751171" lastFinishedPulling="2026-01-21 16:26:10.02406338 +0000 UTC m=+3734.948553002" observedRunningTime="2026-01-21 16:26:11.694506924 +0000 UTC m=+3736.618996576" watchObservedRunningTime="2026-01-21 16:26:11.696663513 +0000 UTC m=+3736.621153135" Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.554719 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.555338 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.605426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.715430 4773 
generic.go:334] "Generic (PLEG): container finished" podID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" containerID="9d5fbe322cf265dac523acc8b611b9867572e5245997c3c4f438d4f0bb6c7830" exitCode=0 Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.715534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" event={"ID":"75ee650b-9398-448d-bc1a-2f7eacd9f2d9","Type":"ContainerDied","Data":"9d5fbe322cf265dac523acc8b611b9867572e5245997c3c4f438d4f0bb6c7830"} Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.764708 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:16 crc kubenswrapper[4773]: I0121 16:26:16.846440 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.842944 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.877730 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-f46cn"] Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.886868 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-f46cn"] Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.923682 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host\") pod \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.923879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host" (OuterVolumeSpecName: "host") pod 
"75ee650b-9398-448d-bc1a-2f7eacd9f2d9" (UID: "75ee650b-9398-448d-bc1a-2f7eacd9f2d9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.924401 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvn4r\" (UniqueName: \"kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r\") pod \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\" (UID: \"75ee650b-9398-448d-bc1a-2f7eacd9f2d9\") " Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.925086 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-host\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:17 crc kubenswrapper[4773]: I0121 16:26:17.930999 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r" (OuterVolumeSpecName: "kube-api-access-hvn4r") pod "75ee650b-9398-448d-bc1a-2f7eacd9f2d9" (UID: "75ee650b-9398-448d-bc1a-2f7eacd9f2d9"). InnerVolumeSpecName "kube-api-access-hvn4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:18 crc kubenswrapper[4773]: I0121 16:26:18.027628 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvn4r\" (UniqueName: \"kubernetes.io/projected/75ee650b-9398-448d-bc1a-2f7eacd9f2d9-kube-api-access-hvn4r\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:18 crc kubenswrapper[4773]: I0121 16:26:18.736371 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-f46cn" Jan 21 16:26:18 crc kubenswrapper[4773]: I0121 16:26:18.736370 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e372f2687bc383de3ac3a50b34f3465ec8ada7c263e7c5d2d6b6022be594b182" Jan 21 16:26:18 crc kubenswrapper[4773]: I0121 16:26:18.736494 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9sn4" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="registry-server" containerID="cri-o://fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c" gracePeriod=2 Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.063999 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-trzgb"] Jan 21 16:26:19 crc kubenswrapper[4773]: E0121 16:26:19.064439 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" containerName="container-00" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.064452 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" containerName="container-00" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.064671 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" containerName="container-00" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.065434 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.149476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxr4p\" (UniqueName: \"kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.149874 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.251791 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxr4p\" (UniqueName: \"kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.251844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.252001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc 
kubenswrapper[4773]: I0121 16:26:19.268966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxr4p\" (UniqueName: \"kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p\") pod \"crc-debug-trzgb\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.385273 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.402755 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ee650b-9398-448d-bc1a-2f7eacd9f2d9" path="/var/lib/kubelet/pods/75ee650b-9398-448d-bc1a-2f7eacd9f2d9/volumes" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.522078 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.661498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities\") pod \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.661654 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content\") pod \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.661942 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pbj9\" (UniqueName: \"kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9\") pod 
\"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\" (UID: \"a68db5d1-f29f-4560-b2d5-1638b5ef0a41\") " Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.662480 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities" (OuterVolumeSpecName: "utilities") pod "a68db5d1-f29f-4560-b2d5-1638b5ef0a41" (UID: "a68db5d1-f29f-4560-b2d5-1638b5ef0a41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.666306 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9" (OuterVolumeSpecName: "kube-api-access-9pbj9") pod "a68db5d1-f29f-4560-b2d5-1638b5ef0a41" (UID: "a68db5d1-f29f-4560-b2d5-1638b5ef0a41"). InnerVolumeSpecName "kube-api-access-9pbj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.685950 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a68db5d1-f29f-4560-b2d5-1638b5ef0a41" (UID: "a68db5d1-f29f-4560-b2d5-1638b5ef0a41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.756381 4773 generic.go:334] "Generic (PLEG): container finished" podID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerID="fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c" exitCode=0 Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.756442 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerDied","Data":"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c"} Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.756500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9sn4" event={"ID":"a68db5d1-f29f-4560-b2d5-1638b5ef0a41","Type":"ContainerDied","Data":"2f397a97adc14ff5de8e65b0de1201a5134469632380ed302bb26cc4b1e2669b"} Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.756524 4773 scope.go:117] "RemoveContainer" containerID="fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.756461 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9sn4" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.760389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" event={"ID":"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5","Type":"ContainerStarted","Data":"b289e11862c71c164ddd5a6297de58b16e4077e1ef91f7a01dfa8d405156d843"} Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.760420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" event={"ID":"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5","Type":"ContainerStarted","Data":"c8831dfbad58f48dd61726b562febf9c516b84ba3d4d842f9fa831efbf247364"} Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.764111 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pbj9\" (UniqueName: \"kubernetes.io/projected/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-kube-api-access-9pbj9\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.764147 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.764159 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a68db5d1-f29f-4560-b2d5-1638b5ef0a41-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.781324 4773 scope.go:117] "RemoveContainer" containerID="855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.787895 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" podStartSLOduration=0.787878882 podStartE2EDuration="787.878882ms" podCreationTimestamp="2026-01-21 
16:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:26:19.774272054 +0000 UTC m=+3744.698761706" watchObservedRunningTime="2026-01-21 16:26:19.787878882 +0000 UTC m=+3744.712368504" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.845894 4773 scope.go:117] "RemoveContainer" containerID="5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.850020 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.863565 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9sn4"] Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.915403 4773 scope.go:117] "RemoveContainer" containerID="fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c" Jan 21 16:26:19 crc kubenswrapper[4773]: E0121 16:26:19.916008 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c\": container with ID starting with fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c not found: ID does not exist" containerID="fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.916063 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c"} err="failed to get container status \"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c\": rpc error: code = NotFound desc = could not find container \"fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c\": container with ID starting with 
fe483fc20f028c2507137a0f1dc15411a9dba4ce4fdc35a7bc93fe2a7357913c not found: ID does not exist" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.916097 4773 scope.go:117] "RemoveContainer" containerID="855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca" Jan 21 16:26:19 crc kubenswrapper[4773]: E0121 16:26:19.916452 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca\": container with ID starting with 855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca not found: ID does not exist" containerID="855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.916507 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca"} err="failed to get container status \"855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca\": rpc error: code = NotFound desc = could not find container \"855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca\": container with ID starting with 855bdd8775fb583a7024b0557bda9b421fcabad3b5988bd89ba70b62a5b341ca not found: ID does not exist" Jan 21 16:26:19 crc kubenswrapper[4773]: I0121 16:26:19.916533 4773 scope.go:117] "RemoveContainer" containerID="5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be" Jan 21 16:26:19 crc kubenswrapper[4773]: E0121 16:26:19.916778 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be\": container with ID starting with 5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be not found: ID does not exist" containerID="5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be" Jan 21 16:26:19 crc 
kubenswrapper[4773]: I0121 16:26:19.916809 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be"} err="failed to get container status \"5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be\": rpc error: code = NotFound desc = could not find container \"5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be\": container with ID starting with 5be7d25571f67e3598850bae83eae068a44ee8aa7ac86903b79cdbfe4724b0be not found: ID does not exist" Jan 21 16:26:20 crc kubenswrapper[4773]: I0121 16:26:20.781769 4773 generic.go:334] "Generic (PLEG): container finished" podID="40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" containerID="b289e11862c71c164ddd5a6297de58b16e4077e1ef91f7a01dfa8d405156d843" exitCode=0 Jan 21 16:26:20 crc kubenswrapper[4773]: I0121 16:26:20.782119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" event={"ID":"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5","Type":"ContainerDied","Data":"b289e11862c71c164ddd5a6297de58b16e4077e1ef91f7a01dfa8d405156d843"} Jan 21 16:26:21 crc kubenswrapper[4773]: I0121 16:26:21.399300 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" path="/var/lib/kubelet/pods/a68db5d1-f29f-4560-b2d5-1638b5ef0a41/volumes" Jan 21 16:26:21 crc kubenswrapper[4773]: I0121 16:26:21.941601 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:21 crc kubenswrapper[4773]: I0121 16:26:21.978419 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-trzgb"] Jan 21 16:26:21 crc kubenswrapper[4773]: I0121 16:26:21.989561 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-trzgb"] Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.124753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxr4p\" (UniqueName: \"kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p\") pod \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.125218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host\") pod \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\" (UID: \"40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5\") " Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.125319 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host" (OuterVolumeSpecName: "host") pod "40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" (UID: "40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.126190 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-host\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.129447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p" (OuterVolumeSpecName: "kube-api-access-sxr4p") pod "40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" (UID: "40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5"). InnerVolumeSpecName "kube-api-access-sxr4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.228113 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxr4p\" (UniqueName: \"kubernetes.io/projected/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5-kube-api-access-sxr4p\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.800950 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8831dfbad58f48dd61726b562febf9c516b84ba3d4d842f9fa831efbf247364" Jan 21 16:26:22 crc kubenswrapper[4773]: I0121 16:26:22.800996 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-trzgb" Jan 21 16:26:22 crc kubenswrapper[4773]: E0121 16:26:22.888507 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40aa427d_a9ea_42e1_a5d7_2bd19c82bfe5.slice\": RecentStats: unable to find data in memory cache]" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.148424 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-gdljj"] Jan 21 16:26:23 crc kubenswrapper[4773]: E0121 16:26:23.149305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="registry-server" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.149319 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="registry-server" Jan 21 16:26:23 crc kubenswrapper[4773]: E0121 16:26:23.149331 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="extract-utilities" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.149337 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="extract-utilities" Jan 21 16:26:23 crc kubenswrapper[4773]: E0121 16:26:23.149346 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" containerName="container-00" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.149353 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" containerName="container-00" Jan 21 16:26:23 crc kubenswrapper[4773]: E0121 16:26:23.149364 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="extract-content" Jan 21 16:26:23 crc 
kubenswrapper[4773]: I0121 16:26:23.149369 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="extract-content" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.149566 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" containerName="container-00" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.149591 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68db5d1-f29f-4560-b2d5-1638b5ef0a41" containerName="registry-server" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.150285 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.349307 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.349776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmww\" (UniqueName: \"kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.398840 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5" path="/var/lib/kubelet/pods/40aa427d-a9ea-42e1-a5d7-2bd19c82bfe5/volumes" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.452206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.452305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmww\" (UniqueName: \"kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.452559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.474995 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmww\" (UniqueName: \"kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww\") pod \"crc-debug-gdljj\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.476628 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:23 crc kubenswrapper[4773]: W0121 16:26:23.510653 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf11548d1_8f6e_4de1_98b2_d32acaf5490d.slice/crio-625f27a33dad6734ee9c99b5f527a3d4f670dd9be0ffd7bc70c97324e8a06a3a WatchSource:0}: Error finding container 625f27a33dad6734ee9c99b5f527a3d4f670dd9be0ffd7bc70c97324e8a06a3a: Status 404 returned error can't find the container with id 625f27a33dad6734ee9c99b5f527a3d4f670dd9be0ffd7bc70c97324e8a06a3a Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.830081 4773 generic.go:334] "Generic (PLEG): container finished" podID="f11548d1-8f6e-4de1-98b2-d32acaf5490d" containerID="190cbd5e3956ba6773eeb5d87cd2366a16bcc0c6a1f90da689ec4fb196671ea2" exitCode=0 Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.830294 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" event={"ID":"f11548d1-8f6e-4de1-98b2-d32acaf5490d","Type":"ContainerDied","Data":"190cbd5e3956ba6773eeb5d87cd2366a16bcc0c6a1f90da689ec4fb196671ea2"} Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.830603 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" event={"ID":"f11548d1-8f6e-4de1-98b2-d32acaf5490d","Type":"ContainerStarted","Data":"625f27a33dad6734ee9c99b5f527a3d4f670dd9be0ffd7bc70c97324e8a06a3a"} Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.873421 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-gdljj"] Jan 21 16:26:23 crc kubenswrapper[4773]: I0121 16:26:23.884060 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh9kw/crc-debug-gdljj"] Jan 21 16:26:24 crc kubenswrapper[4773]: I0121 16:26:24.466501 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-nwxrc_d928eb9a-b6dc-4248-9844-54eab0a907fa/cert-manager-controller/0.log" Jan 21 16:26:24 crc kubenswrapper[4773]: I0121 16:26:24.480949 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ftfc6_f7716ced-8d86-4afa-847b-10feff07e324/cert-manager-cainjector/0.log" Jan 21 16:26:24 crc kubenswrapper[4773]: I0121 16:26:24.491929 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-lsnb5_6f1b2b84-a0ef-43f5-987e-3960271487b8/cert-manager-webhook/0.log" Jan 21 16:26:24 crc kubenswrapper[4773]: I0121 16:26:24.986922 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.185784 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host\") pod \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.185964 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host" (OuterVolumeSpecName: "host") pod "f11548d1-8f6e-4de1-98b2-d32acaf5490d" (UID: "f11548d1-8f6e-4de1-98b2-d32acaf5490d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.186128 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgmww\" (UniqueName: \"kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww\") pod \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\" (UID: \"f11548d1-8f6e-4de1-98b2-d32acaf5490d\") " Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.186845 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f11548d1-8f6e-4de1-98b2-d32acaf5490d-host\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.210311 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww" (OuterVolumeSpecName: "kube-api-access-hgmww") pod "f11548d1-8f6e-4de1-98b2-d32acaf5490d" (UID: "f11548d1-8f6e-4de1-98b2-d32acaf5490d"). InnerVolumeSpecName "kube-api-access-hgmww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.293877 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgmww\" (UniqueName: \"kubernetes.io/projected/f11548d1-8f6e-4de1-98b2-d32acaf5490d-kube-api-access-hgmww\") on node \"crc\" DevicePath \"\"" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.396506 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11548d1-8f6e-4de1-98b2-d32acaf5490d" path="/var/lib/kubelet/pods/f11548d1-8f6e-4de1-98b2-d32acaf5490d/volumes" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.851996 4773 scope.go:117] "RemoveContainer" containerID="190cbd5e3956ba6773eeb5d87cd2366a16bcc0c6a1f90da689ec4fb196671ea2" Jan 21 16:26:25 crc kubenswrapper[4773]: I0121 16:26:25.852099 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh9kw/crc-debug-gdljj" Jan 21 16:26:26 crc kubenswrapper[4773]: I0121 16:26:26.383796 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:26:26 crc kubenswrapper[4773]: E0121 16:26:26.384017 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 16:26:31.665405 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-x2tkc_100f2262-430a-4dd1-a8a2-2cfc06f6e345/nmstate-console-plugin/0.log" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 16:26:31.683705 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xv8b2_4e6397af-d97a-44cb-8e6d-babc6dab33c4/nmstate-handler/0.log" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 16:26:31.696388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-26trv_796aeb37-4f9a-401e-ad8d-5a9da9487e56/nmstate-metrics/0.log" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 16:26:31.706331 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-26trv_796aeb37-4f9a-401e-ad8d-5a9da9487e56/kube-rbac-proxy/0.log" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 16:26:31.729492 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-vqt8s_bd44c2ca-cc7f-432a-89b8-02a06428b3c9/nmstate-operator/0.log" Jan 21 16:26:31 crc kubenswrapper[4773]: I0121 
16:26:31.757816 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-22t95_7587ffb6-642a-4676-a627-1c77024022b2/nmstate-webhook/0.log" Jan 21 16:26:37 crc kubenswrapper[4773]: I0121 16:26:37.384107 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:26:37 crc kubenswrapper[4773]: E0121 16:26:37.384803 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:26:38 crc kubenswrapper[4773]: I0121 16:26:38.412238 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/manager/0.log" Jan 21 16:26:38 crc kubenswrapper[4773]: I0121 16:26:38.418238 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/kube-rbac-proxy/0.log" Jan 21 16:26:44 crc kubenswrapper[4773]: I0121 16:26:44.989238 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-lsjgz_dbe682b1-6f91-4f6c-a43e-8b2520806e28/prometheus-operator/0.log" Jan 21 16:26:44 crc kubenswrapper[4773]: I0121 16:26:44.999394 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_d2451757-b101-4568-87a5-37a165b4a460/prometheus-operator-admission-webhook/0.log" Jan 21 16:26:45 crc kubenswrapper[4773]: I0121 16:26:45.012329 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_0d10d91a-c775-4251-bd46-6034add658e3/prometheus-operator-admission-webhook/0.log" Jan 21 16:26:45 crc kubenswrapper[4773]: I0121 16:26:45.053283 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-649gj_831aa084-8756-4fdc-bc57-38400d4a5650/operator/0.log" Jan 21 16:26:45 crc kubenswrapper[4773]: I0121 16:26:45.066721 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-lnw8m_183b75e6-83ae-40f2-9c03-b2ff4e8959d2/perses-operator/0.log" Jan 21 16:26:50 crc kubenswrapper[4773]: I0121 16:26:50.384369 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:26:50 crc kubenswrapper[4773]: E0121 16:26:50.385381 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:26:51 crc kubenswrapper[4773]: I0121 16:26:51.746948 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/controller/0.log" Jan 21 16:26:51 crc kubenswrapper[4773]: I0121 16:26:51.755361 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/kube-rbac-proxy/0.log" Jan 21 16:26:51 crc kubenswrapper[4773]: I0121 16:26:51.778841 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/controller/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.177659 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.196966 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/reloader/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.205344 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr-metrics/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.212778 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.222188 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy-frr/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.235832 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-frr-files/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.247027 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-reloader/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.256604 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-metrics/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.268486 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-bqp84_1d536346-20d9-48b7-92d2-dd043c7cca4a/frr-k8s-webhook-server/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.306669 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5dcc476cd5-nk2zw_bfc5861c-71cc-4485-8fbd-cc661354fe03/manager/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.334802 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d664df476-46xlp_222e3273-8373-410d-b8f1-fe19aa307ed5/webhook-server/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.658264 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/speaker/0.log" Jan 21 16:26:53 crc kubenswrapper[4773]: I0121 16:26:53.666820 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/kube-rbac-proxy/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.848582 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm_6069f608-c03c-4128-ac35-0b5de3f22145/extract/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.860823 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm_6069f608-c03c-4128-ac35-0b5de3f22145/util/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.868481 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc8rzqm_6069f608-c03c-4128-ac35-0b5de3f22145/pull/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.899328 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw_3c3dbffb-bd58-407b-8b8f-b968770973dd/extract/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.905683 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw_3c3dbffb-bd58-407b-8b8f-b968770973dd/util/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.917053 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773ksdsw_3c3dbffb-bd58-407b-8b8f-b968770973dd/pull/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.928957 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm_26c1eaa2-3f40-480e-addf-1a97073c381c/extract/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.937388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm_26c1eaa2-3f40-480e-addf-1a97073c381c/util/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.950142 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138pdxm_26c1eaa2-3f40-480e-addf-1a97073c381c/pull/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.980632 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv_5476c84f-13d2-4ef5-8426-0147b15b4899/extract/0.log" Jan 21 16:26:57 crc kubenswrapper[4773]: I0121 16:26:57.987550 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv_5476c84f-13d2-4ef5-8426-0147b15b4899/util/0.log" 
Jan 21 16:26:58 crc kubenswrapper[4773]: I0121 16:26:58.015146 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084lqwv_5476c84f-13d2-4ef5-8426-0147b15b4899/pull/0.log"
Jan 21 16:26:58 crc kubenswrapper[4773]: I0121 16:26:58.794013 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w5rdq_37e909e2-8f8b-47ca-bfef-c71fb0a08533/registry-server/0.log"
Jan 21 16:26:58 crc kubenswrapper[4773]: I0121 16:26:58.799885 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w5rdq_37e909e2-8f8b-47ca-bfef-c71fb0a08533/extract-utilities/0.log"
Jan 21 16:26:58 crc kubenswrapper[4773]: I0121 16:26:58.806891 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w5rdq_37e909e2-8f8b-47ca-bfef-c71fb0a08533/extract-content/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.457893 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wmgvw_a6b345a6-34e6-43c7-899b-5e35c36310c4/registry-server/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.469093 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wmgvw_a6b345a6-34e6-43c7-899b-5e35c36310c4/extract-utilities/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.475900 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wmgvw_a6b345a6-34e6-43c7-899b-5e35c36310c4/extract-content/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.493268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hxs87_f51f220a-c9d3-4bb3-938a-72ab3ae24ee7/marketplace-operator/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.720384 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nph5h_520f1326-f825-4ac0-90dd-a02f8dc8756d/registry-server/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.729239 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nph5h_520f1326-f825-4ac0-90dd-a02f8dc8756d/extract-utilities/0.log"
Jan 21 16:26:59 crc kubenswrapper[4773]: I0121 16:26:59.734167 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nph5h_520f1326-f825-4ac0-90dd-a02f8dc8756d/extract-content/0.log"
Jan 21 16:27:00 crc kubenswrapper[4773]: I0121 16:27:00.244557 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn8fn_a70f2296-498b-4347-b80d-1d26a02d7d93/registry-server/0.log"
Jan 21 16:27:00 crc kubenswrapper[4773]: I0121 16:27:00.253318 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn8fn_a70f2296-498b-4347-b80d-1d26a02d7d93/extract-utilities/0.log"
Jan 21 16:27:00 crc kubenswrapper[4773]: I0121 16:27:00.260242 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bn8fn_a70f2296-498b-4347-b80d-1d26a02d7d93/extract-content/0.log"
Jan 21 16:27:01 crc kubenswrapper[4773]: I0121 16:27:01.384347 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:27:01 crc kubenswrapper[4773]: E0121 16:27:01.384673 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:27:04 crc kubenswrapper[4773]: I0121 16:27:04.414838 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-lsjgz_dbe682b1-6f91-4f6c-a43e-8b2520806e28/prometheus-operator/0.log"
Jan 21 16:27:04 crc kubenswrapper[4773]: I0121 16:27:04.425580 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_d2451757-b101-4568-87a5-37a165b4a460/prometheus-operator-admission-webhook/0.log"
Jan 21 16:27:04 crc kubenswrapper[4773]: I0121 16:27:04.435350 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_0d10d91a-c775-4251-bd46-6034add658e3/prometheus-operator-admission-webhook/0.log"
Jan 21 16:27:04 crc kubenswrapper[4773]: I0121 16:27:04.464447 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-649gj_831aa084-8756-4fdc-bc57-38400d4a5650/operator/0.log"
Jan 21 16:27:04 crc kubenswrapper[4773]: I0121 16:27:04.478294 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-lnw8m_183b75e6-83ae-40f2-9c03-b2ff4e8959d2/perses-operator/0.log"
Jan 21 16:27:11 crc kubenswrapper[4773]: I0121 16:27:11.056010 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/manager/0.log"
Jan 21 16:27:11 crc kubenswrapper[4773]: I0121 16:27:11.063443 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/kube-rbac-proxy/0.log"
Jan 21 16:27:13 crc kubenswrapper[4773]: I0121 16:27:13.384051 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:27:13 crc kubenswrapper[4773]: E0121 16:27:13.384598 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:27:26 crc kubenswrapper[4773]: I0121 16:27:26.384552 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:27:26 crc kubenswrapper[4773]: E0121 16:27:26.385402 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:27:37 crc kubenswrapper[4773]: I0121 16:27:37.386851 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:27:37 crc kubenswrapper[4773]: E0121 16:27:37.387772 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:27:48 crc kubenswrapper[4773]: I0121 16:27:48.384570 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:27:48 crc kubenswrapper[4773]: E0121 16:27:48.385427 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:02 crc kubenswrapper[4773]: I0121 16:28:02.385451 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:28:02 crc kubenswrapper[4773]: E0121 16:28:02.386334 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:13 crc kubenswrapper[4773]: I0121 16:28:13.383589 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:28:13 crc kubenswrapper[4773]: E0121 16:28:13.384348 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.723312 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:22 crc kubenswrapper[4773]: E0121 16:28:22.724305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11548d1-8f6e-4de1-98b2-d32acaf5490d" containerName="container-00"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.724321 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11548d1-8f6e-4de1-98b2-d32acaf5490d" containerName="container-00"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.724579 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11548d1-8f6e-4de1-98b2-d32acaf5490d" containerName="container-00"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.726125 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.733754 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.774071 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.774134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhq7r\" (UniqueName: \"kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.774176 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.876035 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhq7r\" (UniqueName: \"kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.876079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.876316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.876716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.876751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:22 crc kubenswrapper[4773]: I0121 16:28:22.914640 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhq7r\" (UniqueName: \"kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r\") pod \"redhat-operators-qgzz6\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") " pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:23 crc kubenswrapper[4773]: I0121 16:28:23.049891 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:23 crc kubenswrapper[4773]: I0121 16:28:23.703401 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:24 crc kubenswrapper[4773]: I0121 16:28:24.006209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerStarted","Data":"9de2cdd63c52aa71b784b43521cae33d3a7cb014837e66b09b5a2ca31dc45509"}
Jan 21 16:28:24 crc kubenswrapper[4773]: I0121 16:28:24.006877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerStarted","Data":"1f644d2ef26ffcf2c6bee462925b3151143d1b89abb4b2d8a7ec180abf972b20"}
Jan 21 16:28:25 crc kubenswrapper[4773]: I0121 16:28:25.016705 4773 generic.go:334] "Generic (PLEG): container finished" podID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerID="9de2cdd63c52aa71b784b43521cae33d3a7cb014837e66b09b5a2ca31dc45509" exitCode=0
Jan 21 16:28:25 crc kubenswrapper[4773]: I0121 16:28:25.016944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerDied","Data":"9de2cdd63c52aa71b784b43521cae33d3a7cb014837e66b09b5a2ca31dc45509"}
Jan 21 16:28:25 crc kubenswrapper[4773]: I0121 16:28:25.021069 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 16:28:26 crc kubenswrapper[4773]: I0121 16:28:26.028533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerStarted","Data":"ede3ef829c498f9bb7e47d790d43e6ee81feb7fd4e467b13e1720803d5b2d5a3"}
Jan 21 16:28:26 crc kubenswrapper[4773]: I0121 16:28:26.383411 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:28:26 crc kubenswrapper[4773]: E0121 16:28:26.383962 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:34 crc kubenswrapper[4773]: I0121 16:28:34.129800 4773 generic.go:334] "Generic (PLEG): container finished" podID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerID="ede3ef829c498f9bb7e47d790d43e6ee81feb7fd4e467b13e1720803d5b2d5a3" exitCode=0
Jan 21 16:28:34 crc kubenswrapper[4773]: I0121 16:28:34.130041 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerDied","Data":"ede3ef829c498f9bb7e47d790d43e6ee81feb7fd4e467b13e1720803d5b2d5a3"}
Jan 21 16:28:35 crc kubenswrapper[4773]: I0121 16:28:35.141826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerStarted","Data":"939505aa34daf8835bb40be9030b8304bcf5525e1c0d0e1f87d9986e926e9a20"}
Jan 21 16:28:40 crc kubenswrapper[4773]: I0121 16:28:40.384366 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:28:40 crc kubenswrapper[4773]: E0121 16:28:40.385255 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.050744 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.051279 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.110553 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.140987 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qgzz6" podStartSLOduration=11.315802909 podStartE2EDuration="21.140962897s" podCreationTimestamp="2026-01-21 16:28:22 +0000 UTC" firstStartedPulling="2026-01-21 16:28:25.020846934 +0000 UTC m=+3869.945336556" lastFinishedPulling="2026-01-21 16:28:34.846006922 +0000 UTC m=+3879.770496544" observedRunningTime="2026-01-21 16:28:35.162373233 +0000 UTC m=+3880.086862855" watchObservedRunningTime="2026-01-21 16:28:43.140962897 +0000 UTC m=+3888.065452519"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.299940 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:43 crc kubenswrapper[4773]: I0121 16:28:43.354212 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:45 crc kubenswrapper[4773]: I0121 16:28:45.268603 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qgzz6" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="registry-server" containerID="cri-o://939505aa34daf8835bb40be9030b8304bcf5525e1c0d0e1f87d9986e926e9a20" gracePeriod=2
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.278464 4773 generic.go:334] "Generic (PLEG): container finished" podID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerID="939505aa34daf8835bb40be9030b8304bcf5525e1c0d0e1f87d9986e926e9a20" exitCode=0
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.278543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerDied","Data":"939505aa34daf8835bb40be9030b8304bcf5525e1c0d0e1f87d9986e926e9a20"}
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.278774 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgzz6" event={"ID":"427dd06b-5ea4-4970-b00d-497ac9164ef3","Type":"ContainerDied","Data":"1f644d2ef26ffcf2c6bee462925b3151143d1b89abb4b2d8a7ec180abf972b20"}
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.278789 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f644d2ef26ffcf2c6bee462925b3151143d1b89abb4b2d8a7ec180abf972b20"
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.283555 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.445834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content\") pod \"427dd06b-5ea4-4970-b00d-497ac9164ef3\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") "
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.445941 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities\") pod \"427dd06b-5ea4-4970-b00d-497ac9164ef3\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") "
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.446027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhq7r\" (UniqueName: \"kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r\") pod \"427dd06b-5ea4-4970-b00d-497ac9164ef3\" (UID: \"427dd06b-5ea4-4970-b00d-497ac9164ef3\") "
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.446822 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities" (OuterVolumeSpecName: "utilities") pod "427dd06b-5ea4-4970-b00d-497ac9164ef3" (UID: "427dd06b-5ea4-4970-b00d-497ac9164ef3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.463920 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r" (OuterVolumeSpecName: "kube-api-access-hhq7r") pod "427dd06b-5ea4-4970-b00d-497ac9164ef3" (UID: "427dd06b-5ea4-4970-b00d-497ac9164ef3"). InnerVolumeSpecName "kube-api-access-hhq7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.550241 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.550294 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhq7r\" (UniqueName: \"kubernetes.io/projected/427dd06b-5ea4-4970-b00d-497ac9164ef3-kube-api-access-hhq7r\") on node \"crc\" DevicePath \"\""
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.623727 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "427dd06b-5ea4-4970-b00d-497ac9164ef3" (UID: "427dd06b-5ea4-4970-b00d-497ac9164ef3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:28:46 crc kubenswrapper[4773]: I0121 16:28:46.652337 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/427dd06b-5ea4-4970-b00d-497ac9164ef3-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:28:47 crc kubenswrapper[4773]: I0121 16:28:47.290574 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgzz6"
Jan 21 16:28:47 crc kubenswrapper[4773]: I0121 16:28:47.340030 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:47 crc kubenswrapper[4773]: I0121 16:28:47.369743 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qgzz6"]
Jan 21 16:28:47 crc kubenswrapper[4773]: I0121 16:28:47.414669 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" path="/var/lib/kubelet/pods/427dd06b-5ea4-4970-b00d-497ac9164ef3/volumes"
Jan 21 16:28:55 crc kubenswrapper[4773]: I0121 16:28:55.401229 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:28:55 crc kubenswrapper[4773]: E0121 16:28:55.402071 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.111138 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-lsjgz_dbe682b1-6f91-4f6c-a43e-8b2520806e28/prometheus-operator/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.126038 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-jxdmj_d2451757-b101-4568-87a5-37a165b4a460/prometheus-operator-admission-webhook/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.157082 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6988787b4c-lmk26_0d10d91a-c775-4251-bd46-6034add658e3/prometheus-operator-admission-webhook/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.199166 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-649gj_831aa084-8756-4fdc-bc57-38400d4a5650/operator/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.224364 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-lnw8m_183b75e6-83ae-40f2-9c03-b2ff4e8959d2/perses-operator/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.645264 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nwxrc_d928eb9a-b6dc-4248-9844-54eab0a907fa/cert-manager-controller/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.666947 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ftfc6_f7716ced-8d86-4afa-847b-10feff07e324/cert-manager-cainjector/0.log"
Jan 21 16:28:59 crc kubenswrapper[4773]: I0121 16:28:59.677747 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-lsnb5_6f1b2b84-a0ef-43f5-987e-3960271487b8/cert-manager-webhook/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.586769 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/controller/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.594297 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vsxqt_0e29da82-0979-40a0-8b48-4ba06d87fd14/kube-rbac-proxy/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.625598 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/controller/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.741185 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/extract/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.750734 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/util/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.763603 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/pull/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.882305 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-h8g4t_330099a5-d43b-482e-a4cb-e6c3bb2c6706/manager/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.956114 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-lvnp8_8678664b-38d8-4482-ae3d-fa1a74a709fd/manager/0.log"
Jan 21 16:29:01 crc kubenswrapper[4773]: I0121 16:29:01.976512 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-s7bsl_516d7adc-2317-406b-92ef-6ed5a74a74b3/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.087591 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-7kgnl_ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.098644 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-hpqt5_8a54ffaf-3268-4696-952e-ee6381310628/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.112438 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-wrww2_50f1e60f-1194-428b-b7e2-ccf0ebb384c7/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.622779 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-pldbp_fdfe2fce-12c1-4026-b40f-77234a609986/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.660826 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-c79x5_32f1de73-4ee0-4eda-8709-d1642d8452f2/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.815915 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-fcbjv_d60c449c-a583-4b8e-8265-9df068220041/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.835915 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-xdcfb_2264dd36-5855-49cc-bf31-1d1e9dcb1f9f/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.926406 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-jz6m4_43aaba2b-296a-407d-9ea2-bbf4c05e868e/manager/0.log"
Jan 21 16:29:02 crc kubenswrapper[4773]: I0121 16:29:02.992953 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d9k6n_9a48a802-404e-4a60-821b-8b91a4830da8/manager/0.log"
Jan 21 16:29:03 crc kubenswrapper[4773]: I0121 16:29:03.115531 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-nwvmf_b7286f1c-434c-4ebb-9d2a-54a6596a63b5/manager/0.log"
Jan 21 16:29:03 crc kubenswrapper[4773]: I0121 16:29:03.134051 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-kn644_ddac57c3-b102-4cfc-8b1e-53de342cef39/manager/0.log"
Jan 21 16:29:03 crc kubenswrapper[4773]: I0121 16:29:03.160538 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz_d8cb173b-7eaa-4183-8028-0a1c4730097c/manager/0.log"
Jan 21 16:29:03 crc kubenswrapper[4773]: I0121 16:29:03.410442 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57bcf57cd7-v96fc_baf015b3-f5b5-4467-8469-bccd49ba94ae/operator/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.589815 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.602651 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/reloader/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.607201 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/frr-metrics/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.620070 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.625071 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/kube-rbac-proxy-frr/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.632656 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-frr-files/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.639552 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-reloader/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.645430 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dtvhb_e336cc2c-2e1a-4d7b-b516-bc360cee8c4b/cp-metrics/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.666396 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-bqp84_1d536346-20d9-48b7-92d2-dd043c7cca4a/frr-k8s-webhook-server/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.698229 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5dcc476cd5-nk2zw_bfc5861c-71cc-4485-8fbd-cc661354fe03/manager/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.710917 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d664df476-46xlp_222e3273-8373-410d-b8f1-fe19aa307ed5/webhook-server/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.865074 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c6777f5bd-z474b_83918de1-f089-46b5-99e4-b249fbe09d65/manager/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.883232 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tzlwr_219ae24e-95b5-4a93-b89b-335ef51b2166/registry-server/0.log"
Jan 21 16:29:04 crc kubenswrapper[4773]: I0121 16:29:04.963552 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-cn7gp_0f906590-d519-4724-bc67-05c6b3a9191d/manager/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.026526 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2bt26_6411a5d3-7b7b-4735-b01c-7c4aa0d5509c/manager/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.071877 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dp2s7_02b38141-e855-4eeb-ac52-d135fb5f44f7/operator/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.113745 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-xs55j_7ab140ad-f64b-45e3-a393-f66567e98a9f/manager/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.254793 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/speaker/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.346235 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wptww_0968840f-f0d5-4b41-8f6f-00b88d26758e/kube-rbac-proxy/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.677819 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5c4ff57dc8-78tss_4a9d0079-9636-4913-95fd-305e8d54280d/manager/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.699977 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-94knv_eeb5c272-4544-47a4-8d08-187872fea7bd/manager/0.log"
Jan 21 16:29:05 crc kubenswrapper[4773]: I0121 16:29:05.732887 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-2wbz9_d7b52472-5c30-471f-a937-c50d96103339/manager/0.log"
Jan 21 16:29:06 crc kubenswrapper[4773]: I0121 16:29:06.519003 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nwxrc_d928eb9a-b6dc-4248-9844-54eab0a907fa/cert-manager-controller/0.log"
Jan 21 16:29:06 crc kubenswrapper[4773]: I0121 16:29:06.532457 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ftfc6_f7716ced-8d86-4afa-847b-10feff07e324/cert-manager-cainjector/0.log"
Jan 21 16:29:06 crc kubenswrapper[4773]: I0121 16:29:06.546036 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-lsnb5_6f1b2b84-a0ef-43f5-987e-3960271487b8/cert-manager-webhook/0.log"
Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.334764 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-x2tkc_100f2262-430a-4dd1-a8a2-2cfc06f6e345/nmstate-console-plugin/0.log"
Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.349051 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xv8b2_4e6397af-d97a-44cb-8e6d-babc6dab33c4/nmstate-handler/0.log"
Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.374364 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-26trv_796aeb37-4f9a-401e-ad8d-5a9da9487e56/nmstate-metrics/0.log"
Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.383420 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:29:07 crc kubenswrapper[4773]: E0121 16:29:07.383679 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.383874 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-26trv_796aeb37-4f9a-401e-ad8d-5a9da9487e56/kube-rbac-proxy/0.log" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.444879 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7pqk4_cafc9bd5-4993-4fcf-ba6d-91028b10e7e8/control-plane-machine-set-operator/0.log" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.463227 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6c6p_569e4aab-6b67-4448-9e6e-ecab14ebc87e/kube-rbac-proxy/0.log" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.592217 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t6c6p_569e4aab-6b67-4448-9e6e-ecab14ebc87e/machine-api-operator/0.log" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.592873 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-vqt8s_bd44c2ca-cc7f-432a-89b8-02a06428b3c9/nmstate-operator/0.log" Jan 21 16:29:07 crc kubenswrapper[4773]: I0121 16:29:07.605510 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-22t95_7587ffb6-642a-4676-a627-1c77024022b2/nmstate-webhook/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.638913 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/extract/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.645162 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/util/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.654625 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2451f2e45dad88e0714d2eb8527bf81a2661f00627369f7caf06c105b2fzjh5_a1dc0aee-f254-4a35-b2d3-a2acbb3c8c44/pull/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.747343 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-h8g4t_330099a5-d43b-482e-a4cb-e6c3bb2c6706/manager/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.813384 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-lvnp8_8678664b-38d8-4482-ae3d-fa1a74a709fd/manager/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.826446 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-s7bsl_516d7adc-2317-406b-92ef-6ed5a74a74b3/manager/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.915075 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-7kgnl_ce51f4fd-fd32-4d8f-bb25-c5a7d4b4680c/manager/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.934037 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-hpqt5_8a54ffaf-3268-4696-952e-ee6381310628/manager/0.log" Jan 21 16:29:08 crc kubenswrapper[4773]: I0121 16:29:08.963926 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-wrww2_50f1e60f-1194-428b-b7e2-ccf0ebb384c7/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.267203 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-pldbp_fdfe2fce-12c1-4026-b40f-77234a609986/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.278825 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-c79x5_32f1de73-4ee0-4eda-8709-d1642d8452f2/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.345138 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-fcbjv_d60c449c-a583-4b8e-8265-9df068220041/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.356114 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-xdcfb_2264dd36-5855-49cc-bf31-1d1e9dcb1f9f/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.401339 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-jz6m4_43aaba2b-296a-407d-9ea2-bbf4c05e868e/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.476467 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d9k6n_9a48a802-404e-4a60-821b-8b91a4830da8/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.566830 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-nwvmf_b7286f1c-434c-4ebb-9d2a-54a6596a63b5/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 
16:29:09.584454 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-kn644_ddac57c3-b102-4cfc-8b1e-53de342cef39/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.599568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz_d8cb173b-7eaa-4183-8028-0a1c4730097c/manager/0.log" Jan 21 16:29:09 crc kubenswrapper[4773]: I0121 16:29:09.749683 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57bcf57cd7-v96fc_baf015b3-f5b5-4467-8469-bccd49ba94ae/operator/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.046323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c6777f5bd-z474b_83918de1-f089-46b5-99e4-b249fbe09d65/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.062221 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tzlwr_219ae24e-95b5-4a93-b89b-335ef51b2166/registry-server/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.124037 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-cn7gp_0f906590-d519-4724-bc67-05c6b3a9191d/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.153975 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2bt26_6411a5d3-7b7b-4735-b01c-7c4aa0d5509c/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.181260 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dp2s7_02b38141-e855-4eeb-ac52-d135fb5f44f7/operator/0.log" Jan 21 16:29:11 crc 
kubenswrapper[4773]: I0121 16:29:11.207901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-xs55j_7ab140ad-f64b-45e3-a393-f66567e98a9f/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.602508 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5c4ff57dc8-78tss_4a9d0079-9636-4913-95fd-305e8d54280d/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.623647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-94knv_eeb5c272-4544-47a4-8d08-187872fea7bd/manager/0.log" Jan 21 16:29:11 crc kubenswrapper[4773]: I0121 16:29:11.633229 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-2wbz9_d7b52472-5c30-471f-a937-c50d96103339/manager/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.554885 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/kube-multus-additional-cni-plugins/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.568428 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/egress-router-binary-copy/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.577553 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/cni-plugins/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.586062 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/bond-cni-plugin/0.log" Jan 21 16:29:13 crc 
kubenswrapper[4773]: I0121 16:29:13.593657 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/routeoverride-cni/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.603813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/whereabouts-cni-bincopy/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.629091 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f67j_7dcf05b0-62cc-4b1d-a21f-3dca696bf8c2/whereabouts-cni/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.660714 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-f5jgd_b347bcd3-0e23-40a4-8e27-9140db184474/multus-admission-controller/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.667296 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-f5jgd_b347bcd3-0e23-40a4-8e27-9140db184474/kube-rbac-proxy/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.720281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/1.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.783950 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gc5wj_34d54fdd-eda0-441f-b721-0adecc20a0db/kube-multus/2.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.817902 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8n66g_1a01fed4-2691-453e-b74f-c000d5125b53/network-metrics-daemon/0.log" Jan 21 16:29:13 crc kubenswrapper[4773]: I0121 16:29:13.827663 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-8n66g_1a01fed4-2691-453e-b74f-c000d5125b53/kube-rbac-proxy/0.log" Jan 21 16:29:14 crc kubenswrapper[4773]: I0121 16:29:14.845105 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/manager/0.log" Jan 21 16:29:14 crc kubenswrapper[4773]: I0121 16:29:14.851312 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55d66b9568-sfgsj_e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20/kube-rbac-proxy/0.log" Jan 21 16:29:22 crc kubenswrapper[4773]: I0121 16:29:22.384270 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:29:22 crc kubenswrapper[4773]: E0121 16:29:22.385228 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:29:33 crc kubenswrapper[4773]: I0121 16:29:33.384012 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:29:33 crc kubenswrapper[4773]: E0121 16:29:33.384707 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 
16:29:47 crc kubenswrapper[4773]: I0121 16:29:47.383598 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:29:47 crc kubenswrapper[4773]: E0121 16:29:47.384354 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.226585 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j"] Jan 21 16:30:00 crc kubenswrapper[4773]: E0121 16:30:00.227737 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="registry-server" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.227755 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="registry-server" Jan 21 16:30:00 crc kubenswrapper[4773]: E0121 16:30:00.227787 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="extract-utilities" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.227796 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="extract-utilities" Jan 21 16:30:00 crc kubenswrapper[4773]: E0121 16:30:00.227811 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="extract-content" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.227818 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="extract-content" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.228055 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="427dd06b-5ea4-4970-b00d-497ac9164ef3" containerName="registry-server" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.229047 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.231554 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.240066 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.265780 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j"] Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.278163 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.278263 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc 
kubenswrapper[4773]: I0121 16:30:00.278289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wz4\" (UniqueName: \"kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.380190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.380275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.380307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wz4\" (UniqueName: \"kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.381728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume\") pod \"collect-profiles-29483550-l646j\" 
(UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.387390 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.709271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.715788 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wz4\" (UniqueName: \"kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4\") pod \"collect-profiles-29483550-l646j\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:00 crc kubenswrapper[4773]: I0121 16:30:00.861089 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" Jan 21 16:30:01 crc kubenswrapper[4773]: W0121 16:30:01.396350 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2085a400_c015_497d_82f2_0d0a90d692bc.slice/crio-987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b WatchSource:0}: Error finding container 987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b: Status 404 returned error can't find the container with id 987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b Jan 21 16:30:01 crc kubenswrapper[4773]: I0121 16:30:01.396871 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j"] Jan 21 16:30:02 crc kubenswrapper[4773]: I0121 16:30:02.168051 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1"} Jan 21 16:30:02 crc kubenswrapper[4773]: I0121 16:30:02.171208 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" event={"ID":"2085a400-c015-497d-82f2-0d0a90d692bc","Type":"ContainerStarted","Data":"bf684dbc6d3f0604f0aeda0df4024252f81b249c5469fed566ff7fb47cb1c770"} Jan 21 16:30:02 crc kubenswrapper[4773]: I0121 16:30:02.171262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" event={"ID":"2085a400-c015-497d-82f2-0d0a90d692bc","Type":"ContainerStarted","Data":"987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b"} Jan 21 16:30:02 crc kubenswrapper[4773]: I0121 16:30:02.228727 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" podStartSLOduration=2.228681544 podStartE2EDuration="2.228681544s" podCreationTimestamp="2026-01-21 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:30:02.210631285 +0000 UTC m=+3967.135120907" watchObservedRunningTime="2026-01-21 16:30:02.228681544 +0000 UTC m=+3967.153171166" Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:03.183794 4773 generic.go:334] "Generic (PLEG): container finished" podID="2085a400-c015-497d-82f2-0d0a90d692bc" containerID="bf684dbc6d3f0604f0aeda0df4024252f81b249c5469fed566ff7fb47cb1c770" exitCode=0 Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:03.183982 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" event={"ID":"2085a400-c015-497d-82f2-0d0a90d692bc","Type":"ContainerDied","Data":"bf684dbc6d3f0604f0aeda0df4024252f81b249c5469fed566ff7fb47cb1c770"} Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:04.323613 4773 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-68gt2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:04.323743 4773 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-68gt2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:04.324228 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" podUID="0bb536d4-f4ae-44ac-8477-0d14b97ebe04" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:30:04 crc kubenswrapper[4773]: I0121 16:30:04.324148 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-68gt2" podUID="0bb536d4-f4ae-44ac-8477-0d14b97ebe04" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.689345 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j"
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.716705 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume\") pod \"2085a400-c015-497d-82f2-0d0a90d692bc\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") "
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.717044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2wz4\" (UniqueName: \"kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4\") pod \"2085a400-c015-497d-82f2-0d0a90d692bc\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") "
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.717101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume\") pod \"2085a400-c015-497d-82f2-0d0a90d692bc\" (UID: \"2085a400-c015-497d-82f2-0d0a90d692bc\") "
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.717891 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume" (OuterVolumeSpecName: "config-volume") pod "2085a400-c015-497d-82f2-0d0a90d692bc" (UID: "2085a400-c015-497d-82f2-0d0a90d692bc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.718070 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2085a400-c015-497d-82f2-0d0a90d692bc-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.734942 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2085a400-c015-497d-82f2-0d0a90d692bc" (UID: "2085a400-c015-497d-82f2-0d0a90d692bc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.746488 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4" (OuterVolumeSpecName: "kube-api-access-h2wz4") pod "2085a400-c015-497d-82f2-0d0a90d692bc" (UID: "2085a400-c015-497d-82f2-0d0a90d692bc"). InnerVolumeSpecName "kube-api-access-h2wz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.820001 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2wz4\" (UniqueName: \"kubernetes.io/projected/2085a400-c015-497d-82f2-0d0a90d692bc-kube-api-access-h2wz4\") on node \"crc\" DevicePath \"\""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.820331 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2085a400-c015-497d-82f2-0d0a90d692bc-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.948768 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"]
Jan 21 16:30:05 crc kubenswrapper[4773]: I0121 16:30:05.961746 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483505-zgc96"]
Jan 21 16:30:06 crc kubenswrapper[4773]: I0121 16:30:06.214445 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j" event={"ID":"2085a400-c015-497d-82f2-0d0a90d692bc","Type":"ContainerDied","Data":"987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b"}
Jan 21 16:30:06 crc kubenswrapper[4773]: I0121 16:30:06.214494 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987f72c176060a9719edbecf17dd643227af986ba9255d343b86485ed2a53d0b"
Jan 21 16:30:06 crc kubenswrapper[4773]: I0121 16:30:06.214502 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483550-l646j"
Jan 21 16:30:07 crc kubenswrapper[4773]: I0121 16:30:07.405742 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527a62a3-540f-4352-903c-184f60e613a7" path="/var/lib/kubelet/pods/527a62a3-540f-4352-903c-184f60e613a7/volumes"
Jan 21 16:30:22 crc kubenswrapper[4773]: I0121 16:30:22.955543 4773 scope.go:117] "RemoveContainer" containerID="349ce91b7ddd85303a5dfbaac856165d8f9cf548169584d19d626175eb2cc750"
Jan 21 16:31:23 crc kubenswrapper[4773]: I0121 16:31:23.034271 4773 scope.go:117] "RemoveContainer" containerID="9d5fbe322cf265dac523acc8b611b9867572e5245997c3c4f438d4f0bb6c7830"
Jan 21 16:32:23 crc kubenswrapper[4773]: I0121 16:32:23.162266 4773 scope.go:117] "RemoveContainer" containerID="b289e11862c71c164ddd5a6297de58b16e4077e1ef91f7a01dfa8d405156d843"
Jan 21 16:32:25 crc kubenswrapper[4773]: I0121 16:32:25.205726 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:32:25 crc kubenswrapper[4773]: I0121 16:32:25.206759 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:32:55 crc kubenswrapper[4773]: I0121 16:32:55.206432 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:32:55 crc kubenswrapper[4773]: I0121 16:32:55.207024 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.205739 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.206273 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.206330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.207255 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.207313 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1" gracePeriod=600
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.474990 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1" exitCode=0
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.475077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1"}
Jan 21 16:33:25 crc kubenswrapper[4773]: I0121 16:33:25.475401 4773 scope.go:117] "RemoveContainer" containerID="91ed33829aa7933d272f7b277d51bb0e0ede0da09d71115d405a9fc8435739df"
Jan 21 16:33:26 crc kubenswrapper[4773]: I0121 16:33:26.487743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"}
Jan 21 16:35:23 crc kubenswrapper[4773]: I0121 16:35:23.396121 4773 scope.go:117] "RemoveContainer" containerID="ede3ef829c498f9bb7e47d790d43e6ee81feb7fd4e467b13e1720803d5b2d5a3"
Jan 21 16:35:23 crc kubenswrapper[4773]: I0121 16:35:23.420318 4773 scope.go:117] "RemoveContainer" containerID="939505aa34daf8835bb40be9030b8304bcf5525e1c0d0e1f87d9986e926e9a20"
Jan 21 16:35:23 crc kubenswrapper[4773]: I0121 16:35:23.469072 4773 scope.go:117] "RemoveContainer" containerID="9de2cdd63c52aa71b784b43521cae33d3a7cb014837e66b09b5a2ca31dc45509"
Jan 21 16:35:25 crc kubenswrapper[4773]: I0121 16:35:25.205733 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:35:25 crc kubenswrapper[4773]: I0121 16:35:25.206045 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.459311 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dmwc5"]
Jan 21 16:35:27 crc kubenswrapper[4773]: E0121 16:35:27.459954 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2085a400-c015-497d-82f2-0d0a90d692bc" containerName="collect-profiles"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.459966 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2085a400-c015-497d-82f2-0d0a90d692bc" containerName="collect-profiles"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.460170 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2085a400-c015-497d-82f2-0d0a90d692bc" containerName="collect-profiles"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.461679 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.490104 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dmwc5"]
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.582836 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.583298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vcx\" (UniqueName: \"kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.583367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.686880 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vcx\" (UniqueName: \"kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.686923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.687032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.687534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.688119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.710849 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vcx\" (UniqueName: \"kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx\") pod \"community-operators-dmwc5\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") " pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:27 crc kubenswrapper[4773]: I0121 16:35:27.795258 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:28 crc kubenswrapper[4773]: I0121 16:35:28.338251 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dmwc5"]
Jan 21 16:35:28 crc kubenswrapper[4773]: I0121 16:35:28.673478 4773 generic.go:334] "Generic (PLEG): container finished" podID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerID="9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f" exitCode=0
Jan 21 16:35:28 crc kubenswrapper[4773]: I0121 16:35:28.673539 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerDied","Data":"9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f"}
Jan 21 16:35:28 crc kubenswrapper[4773]: I0121 16:35:28.673596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerStarted","Data":"7e7772068df314e90468ec2fe9b32531a6eaa25a0b22fcb59f497e788d2da58a"}
Jan 21 16:35:28 crc kubenswrapper[4773]: I0121 16:35:28.675386 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 16:35:31 crc kubenswrapper[4773]: I0121 16:35:31.743306 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerStarted","Data":"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac"}
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.459151 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.463073 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.488949 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.584065 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.584265 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.584776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rt8\" (UniqueName: \"kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.688286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.688960 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.688453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rt8\" (UniqueName: \"kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.689147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.689525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.732837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rt8\" (UniqueName: \"kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8\") pod \"certified-operators-z7xqx\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") " pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.759175 4773 generic.go:334] "Generic (PLEG): container finished" podID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerID="82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac" exitCode=0
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.759471 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerDied","Data":"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac"}
Jan 21 16:35:32 crc kubenswrapper[4773]: I0121 16:35:32.783838 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:33 crc kubenswrapper[4773]: I0121 16:35:33.373934 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:33 crc kubenswrapper[4773]: W0121 16:35:33.378323 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9abb9509_5532_46eb_985b_50ba1cb08eb7.slice/crio-a5d639e5e53ffaa55dff17fc8db0c1f52315011038912d987f1ff7965252725c WatchSource:0}: Error finding container a5d639e5e53ffaa55dff17fc8db0c1f52315011038912d987f1ff7965252725c: Status 404 returned error can't find the container with id a5d639e5e53ffaa55dff17fc8db0c1f52315011038912d987f1ff7965252725c
Jan 21 16:35:33 crc kubenswrapper[4773]: I0121 16:35:33.770785 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerStarted","Data":"a5d639e5e53ffaa55dff17fc8db0c1f52315011038912d987f1ff7965252725c"}
Jan 21 16:35:34 crc kubenswrapper[4773]: I0121 16:35:34.782511 4773 generic.go:334] "Generic (PLEG): container finished" podID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerID="2859d0aa3a86f611e8b8d6ff272a2505ba5d7308bc1c2f38b9abf7b4438c93ab" exitCode=0
Jan 21 16:35:34 crc kubenswrapper[4773]: I0121 16:35:34.782658 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerDied","Data":"2859d0aa3a86f611e8b8d6ff272a2505ba5d7308bc1c2f38b9abf7b4438c93ab"}
Jan 21 16:35:34 crc kubenswrapper[4773]: I0121 16:35:34.785494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerStarted","Data":"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f"}
Jan 21 16:35:34 crc kubenswrapper[4773]: I0121 16:35:34.859821 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dmwc5" podStartSLOduration=3.080193472 podStartE2EDuration="7.859802883s" podCreationTimestamp="2026-01-21 16:35:27 +0000 UTC" firstStartedPulling="2026-01-21 16:35:28.675175439 +0000 UTC m=+4293.599665061" lastFinishedPulling="2026-01-21 16:35:33.45478485 +0000 UTC m=+4298.379274472" observedRunningTime="2026-01-21 16:35:34.829989815 +0000 UTC m=+4299.754479437" watchObservedRunningTime="2026-01-21 16:35:34.859802883 +0000 UTC m=+4299.784292505"
Jan 21 16:35:35 crc kubenswrapper[4773]: I0121 16:35:35.796724 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerStarted","Data":"644fed542669de7a8461ae32f7a5153904245e1c5322ec1e72d626c7057b5557"}
Jan 21 16:35:36 crc kubenswrapper[4773]: I0121 16:35:36.807138 4773 generic.go:334] "Generic (PLEG): container finished" podID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerID="644fed542669de7a8461ae32f7a5153904245e1c5322ec1e72d626c7057b5557" exitCode=0
Jan 21 16:35:36 crc kubenswrapper[4773]: I0121 16:35:36.807189 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerDied","Data":"644fed542669de7a8461ae32f7a5153904245e1c5322ec1e72d626c7057b5557"}
Jan 21 16:35:37 crc kubenswrapper[4773]: I0121 16:35:37.795738 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:37 crc kubenswrapper[4773]: I0121 16:35:37.796100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:37 crc kubenswrapper[4773]: I0121 16:35:37.858986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:38 crc kubenswrapper[4773]: I0121 16:35:38.841598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerStarted","Data":"54c24523c5fcc88dd4f14bd3203006b8244cefc91a8f59ea85d831b75b8d6a89"}
Jan 21 16:35:38 crc kubenswrapper[4773]: I0121 16:35:38.870204 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7xqx" podStartSLOduration=3.838556708 podStartE2EDuration="6.870181396s" podCreationTimestamp="2026-01-21 16:35:32 +0000 UTC" firstStartedPulling="2026-01-21 16:35:34.78587743 +0000 UTC m=+4299.710367052" lastFinishedPulling="2026-01-21 16:35:37.817502118 +0000 UTC m=+4302.741991740" observedRunningTime="2026-01-21 16:35:38.861495791 +0000 UTC m=+4303.785985443" watchObservedRunningTime="2026-01-21 16:35:38.870181396 +0000 UTC m=+4303.794671018"
Jan 21 16:35:42 crc kubenswrapper[4773]: I0121 16:35:42.784819 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:42 crc kubenswrapper[4773]: I0121 16:35:42.786285 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:42 crc kubenswrapper[4773]: I0121 16:35:42.830800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:43 crc kubenswrapper[4773]: I0121 16:35:43.935163 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:45 crc kubenswrapper[4773]: I0121 16:35:45.458824 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:45 crc kubenswrapper[4773]: I0121 16:35:45.906668 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7xqx" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="registry-server" containerID="cri-o://54c24523c5fcc88dd4f14bd3203006b8244cefc91a8f59ea85d831b75b8d6a89" gracePeriod=2
Jan 21 16:35:46 crc kubenswrapper[4773]: I0121 16:35:46.916141 4773 generic.go:334] "Generic (PLEG): container finished" podID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerID="54c24523c5fcc88dd4f14bd3203006b8244cefc91a8f59ea85d831b75b8d6a89" exitCode=0
Jan 21 16:35:46 crc kubenswrapper[4773]: I0121 16:35:46.916211 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerDied","Data":"54c24523c5fcc88dd4f14bd3203006b8244cefc91a8f59ea85d831b75b8d6a89"}
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.084632 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.195029 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rt8\" (UniqueName: \"kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8\") pod \"9abb9509-5532-46eb-985b-50ba1cb08eb7\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") "
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.195166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content\") pod \"9abb9509-5532-46eb-985b-50ba1cb08eb7\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") "
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.195542 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities\") pod \"9abb9509-5532-46eb-985b-50ba1cb08eb7\" (UID: \"9abb9509-5532-46eb-985b-50ba1cb08eb7\") "
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.197306 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities" (OuterVolumeSpecName: "utilities") pod "9abb9509-5532-46eb-985b-50ba1cb08eb7" (UID: "9abb9509-5532-46eb-985b-50ba1cb08eb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.203073 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8" (OuterVolumeSpecName: "kube-api-access-t4rt8") pod "9abb9509-5532-46eb-985b-50ba1cb08eb7" (UID: "9abb9509-5532-46eb-985b-50ba1cb08eb7"). InnerVolumeSpecName "kube-api-access-t4rt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.244991 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9abb9509-5532-46eb-985b-50ba1cb08eb7" (UID: "9abb9509-5532-46eb-985b-50ba1cb08eb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.297585 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4rt8\" (UniqueName: \"kubernetes.io/projected/9abb9509-5532-46eb-985b-50ba1cb08eb7-kube-api-access-t4rt8\") on node \"crc\" DevicePath \"\""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.297635 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.297645 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abb9509-5532-46eb-985b-50ba1cb08eb7-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.849439 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.929592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xqx" event={"ID":"9abb9509-5532-46eb-985b-50ba1cb08eb7","Type":"ContainerDied","Data":"a5d639e5e53ffaa55dff17fc8db0c1f52315011038912d987f1ff7965252725c"}
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.929657 4773 scope.go:117] "RemoveContainer" containerID="54c24523c5fcc88dd4f14bd3203006b8244cefc91a8f59ea85d831b75b8d6a89"
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.929846 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xqx"
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.966639 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:47 crc kubenswrapper[4773]: I0121 16:35:47.999380 4773 scope.go:117] "RemoveContainer" containerID="644fed542669de7a8461ae32f7a5153904245e1c5322ec1e72d626c7057b5557"
Jan 21 16:35:48 crc kubenswrapper[4773]: I0121 16:35:48.018685 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7xqx"]
Jan 21 16:35:48 crc kubenswrapper[4773]: I0121 16:35:48.021088 4773 scope.go:117] "RemoveContainer" containerID="2859d0aa3a86f611e8b8d6ff272a2505ba5d7308bc1c2f38b9abf7b4438c93ab"
Jan 21 16:35:49 crc kubenswrapper[4773]: I0121 16:35:49.395595 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" path="/var/lib/kubelet/pods/9abb9509-5532-46eb-985b-50ba1cb08eb7/volumes"
Jan 21 16:35:50 crc kubenswrapper[4773]: I0121 16:35:50.848379 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dmwc5"]
Jan 21 16:35:50 crc kubenswrapper[4773]: I0121 16:35:50.848892 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dmwc5" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="registry-server" containerID="cri-o://b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f" gracePeriod=2
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.711798 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.897724 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9vcx\" (UniqueName: \"kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx\") pod \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") "
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.898047 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content\") pod \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") "
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.898086 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities\") pod \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\" (UID: \"2787e8f5-fb49-468f-8db7-afe2af0efa5f\") "
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.899065 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities" (OuterVolumeSpecName: "utilities") pod "2787e8f5-fb49-468f-8db7-afe2af0efa5f" (UID: "2787e8f5-fb49-468f-8db7-afe2af0efa5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.903304 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx" (OuterVolumeSpecName: "kube-api-access-z9vcx") pod "2787e8f5-fb49-468f-8db7-afe2af0efa5f" (UID: "2787e8f5-fb49-468f-8db7-afe2af0efa5f"). InnerVolumeSpecName "kube-api-access-z9vcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.955908 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2787e8f5-fb49-468f-8db7-afe2af0efa5f" (UID: "2787e8f5-fb49-468f-8db7-afe2af0efa5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.980584 4773 generic.go:334] "Generic (PLEG): container finished" podID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerID="b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f" exitCode=0
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.980657 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmwc5"
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.980669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerDied","Data":"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f"}
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.980740 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmwc5" event={"ID":"2787e8f5-fb49-468f-8db7-afe2af0efa5f","Type":"ContainerDied","Data":"7e7772068df314e90468ec2fe9b32531a6eaa25a0b22fcb59f497e788d2da58a"}
Jan 21 16:35:51 crc kubenswrapper[4773]: I0121 16:35:51.980761 4773 scope.go:117] "RemoveContainer" containerID="b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f"
Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:51.999975 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9vcx\" (UniqueName:
\"kubernetes.io/projected/2787e8f5-fb49-468f-8db7-afe2af0efa5f-kube-api-access-z9vcx\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.000005 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.000014 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2787e8f5-fb49-468f-8db7-afe2af0efa5f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.022828 4773 scope.go:117] "RemoveContainer" containerID="82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.039957 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dmwc5"] Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.050765 4773 scope.go:117] "RemoveContainer" containerID="9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.052570 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dmwc5"] Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.098308 4773 scope.go:117] "RemoveContainer" containerID="b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f" Jan 21 16:35:52 crc kubenswrapper[4773]: E0121 16:35:52.098637 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f\": container with ID starting with b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f not found: ID does not exist" containerID="b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f" 
Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.098667 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f"} err="failed to get container status \"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f\": rpc error: code = NotFound desc = could not find container \"b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f\": container with ID starting with b35f88dccf0c8c863230fde5f02dd958bc958cc418580a92c7f6b9ba20b9750f not found: ID does not exist" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.098695 4773 scope.go:117] "RemoveContainer" containerID="82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac" Jan 21 16:35:52 crc kubenswrapper[4773]: E0121 16:35:52.099214 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac\": container with ID starting with 82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac not found: ID does not exist" containerID="82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.099254 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac"} err="failed to get container status \"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac\": rpc error: code = NotFound desc = could not find container \"82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac\": container with ID starting with 82b5900f1c1a2b08d46d4624c3eb34d2ead3b21ae21b37072f95d0aea4d027ac not found: ID does not exist" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.099289 4773 scope.go:117] "RemoveContainer" 
containerID="9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f" Jan 21 16:35:52 crc kubenswrapper[4773]: E0121 16:35:52.099660 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f\": container with ID starting with 9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f not found: ID does not exist" containerID="9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f" Jan 21 16:35:52 crc kubenswrapper[4773]: I0121 16:35:52.099721 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f"} err="failed to get container status \"9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f\": rpc error: code = NotFound desc = could not find container \"9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f\": container with ID starting with 9e497e938b7b7e5f3ad0403056499b4bd38874ee570b4787209bb9416171191f not found: ID does not exist" Jan 21 16:35:53 crc kubenswrapper[4773]: I0121 16:35:53.398114 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" path="/var/lib/kubelet/pods/2787e8f5-fb49-468f-8db7-afe2af0efa5f/volumes" Jan 21 16:35:55 crc kubenswrapper[4773]: I0121 16:35:55.206313 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:35:55 crc kubenswrapper[4773]: I0121 16:35:55.206656 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:36:25 crc kubenswrapper[4773]: I0121 16:36:25.205809 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:36:25 crc kubenswrapper[4773]: I0121 16:36:25.206313 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:36:25 crc kubenswrapper[4773]: I0121 16:36:25.206347 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:36:25 crc kubenswrapper[4773]: I0121 16:36:25.207061 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:36:25 crc kubenswrapper[4773]: I0121 16:36:25.207113 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" gracePeriod=600 Jan 21 16:36:25 crc kubenswrapper[4773]: E0121 
16:36:25.334521 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:36:26 crc kubenswrapper[4773]: I0121 16:36:26.304812 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" exitCode=0 Jan 21 16:36:26 crc kubenswrapper[4773]: I0121 16:36:26.304927 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"} Jan 21 16:36:26 crc kubenswrapper[4773]: I0121 16:36:26.305213 4773 scope.go:117] "RemoveContainer" containerID="5478b0f01dc74b7043f1932bc410ad3e4eb13ace26ba52d09d1d638ac3c6f2c1" Jan 21 16:36:26 crc kubenswrapper[4773]: I0121 16:36:26.306116 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:36:26 crc kubenswrapper[4773]: E0121 16:36:26.306540 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:36:40 crc kubenswrapper[4773]: I0121 16:36:40.384036 4773 scope.go:117] "RemoveContainer" 
containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:36:40 crc kubenswrapper[4773]: E0121 16:36:40.384788 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:36:54 crc kubenswrapper[4773]: I0121 16:36:54.383454 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:36:54 crc kubenswrapper[4773]: E0121 16:36:54.385365 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:37:07 crc kubenswrapper[4773]: I0121 16:37:07.383573 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:37:07 crc kubenswrapper[4773]: E0121 16:37:07.384535 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:37:21 crc kubenswrapper[4773]: I0121 16:37:21.385072 4773 scope.go:117] 
"RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:37:21 crc kubenswrapper[4773]: E0121 16:37:21.385861 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:37:36 crc kubenswrapper[4773]: I0121 16:37:36.383838 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:37:36 crc kubenswrapper[4773]: E0121 16:37:36.384669 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:37:50 crc kubenswrapper[4773]: I0121 16:37:50.383856 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:37:50 crc kubenswrapper[4773]: E0121 16:37:50.384597 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:38:04 crc kubenswrapper[4773]: I0121 16:38:04.384220 
4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:38:04 crc kubenswrapper[4773]: E0121 16:38:04.385012 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:38:19 crc kubenswrapper[4773]: I0121 16:38:19.384756 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:38:19 crc kubenswrapper[4773]: E0121 16:38:19.385464 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:38:31 crc kubenswrapper[4773]: I0121 16:38:31.383738 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:38:31 crc kubenswrapper[4773]: E0121 16:38:31.384459 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:38:43 crc kubenswrapper[4773]: I0121 
16:38:43.383356 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:38:43 crc kubenswrapper[4773]: E0121 16:38:43.384112 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:38:57 crc kubenswrapper[4773]: I0121 16:38:57.384017 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:38:57 crc kubenswrapper[4773]: E0121 16:38:57.385121 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:39:34 crc kubenswrapper[4773]: E0121 16:39:34.418252 4773 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Jan 21 16:39:34 crc kubenswrapper[4773]: I0121 16:39:34.422626 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-tzlwr" podUID="219ae24e-95b5-4a93-b89b-335ef51b2166" containerName="registry-server" probeResult="failure" output="command timed out" Jan 21 16:39:34 crc kubenswrapper[4773]: I0121 16:39:34.454750 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-central-agent" 
probeResult="failure" output="command timed out" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:34.810114 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" podUID="83918de1-f089-46b5-99e4-b249fbe09d65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": EOF" Jan 21 16:39:35 crc kubenswrapper[4773]: E0121 16:39:34.929204 4773 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.139062 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": EOF" start-of-body= Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.139384 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": EOF" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.231231 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.231301 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 21 
16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.253396 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": read tcp 10.217.0.2:55278->10.217.0.94:8081: read: connection reset by peer" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.253742 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/healthz\": read tcp 10.217.0.2:55294->10.217.0.94:8081: read: connection reset by peer" Jan 21 16:39:35 crc kubenswrapper[4773]: E0121 16:39:35.130302 4773 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.304081 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" podUID="32f1de73-4ee0-4eda-8709-d1642d8452f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": dial tcp 10.217.0.85:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.325718 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" podUID="83918de1-f089-46b5-99e4-b249fbe09d65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/healthz\": EOF" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.330388 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": dial tcp 10.217.0.94:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.342587 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" podUID="baf015b3-f5b5-4467-8469-bccd49ba94ae" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": dial tcp 10.217.0.77:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.344433 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" podUID="baf015b3-f5b5-4467-8469-bccd49ba94ae" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.363283 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" podUID="eeb5c272-4544-47a4-8d08-187872fea7bd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.98:8081/healthz\": dial tcp 10.217.0.98:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.363385 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" podUID="eeb5c272-4544-47a4-8d08-187872fea7bd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.98:8081/readyz\": dial tcp 10.217.0.98:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.390851 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" podUID="32f1de73-4ee0-4eda-8709-d1642d8452f2" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.398662 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" podUID="83918de1-f089-46b5-99e4-b249fbe09d65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": dial tcp 10.217.0.100:8081: connect: connection refused" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.418180 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="44cd53f0-37e0-4b02-9922-49d99dfee92a" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.124:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.431953 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="9e04988a-e98a-4c9d-9a51-e0e69a6810c9" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.125:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.436799 4773 patch_prober.go:28] interesting pod/loki-operator-controller-manager-55d66b9568-sfgsj container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": read tcp 10.217.0.2:54198->10.217.0.49:8081: read: connection reset by peer" start-of-body= Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.436839 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" podUID="e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20" containerName="manager" probeResult="failure" 
output="Get \"http://10.217.0.49:8081/readyz\": read tcp 10.217.0.2:54198->10.217.0.49:8081: read: connection reset by peer" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.436983 4773 patch_prober.go:28] interesting pod/loki-operator-controller-manager-55d66b9568-sfgsj container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.49:8081/healthz\": read tcp 10.217.0.2:54184->10.217.0.49:8081: read: connection reset by peer" start-of-body= Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.436997 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" podUID="e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/healthz\": read tcp 10.217.0.2:54184->10.217.0.49:8081: read: connection reset by peer" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.494855 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-wptww" podUID="0968840f-f0d5-4b41-8f6f-00b88d26758e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:39:35 crc kubenswrapper[4773]: I0121 16:39:35.535492 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:39:35 crc kubenswrapper[4773]: E0121 16:39:35.535811 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:39:36 crc 
kubenswrapper[4773]: I0121 16:39:35.769939 4773 trace.go:236] Trace[1706462338]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (21-Jan-2026 16:39:34.418) (total time: 1351ms): Jan 21 16:39:36 crc kubenswrapper[4773]: Trace[1706462338]: [1.351299642s] [1.351299642s] END Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:35.830194 4773 trace.go:236] Trace[1155990309]: "Calculate volume metrics of glance for pod openstack/glance-default-external-api-0" (21-Jan-2026 16:39:34.679) (total time: 1150ms): Jan 21 16:39:36 crc kubenswrapper[4773]: Trace[1155990309]: [1.150225577s] [1.150225577s] END Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:35.857009 4773 trace.go:236] Trace[1558241512]: "Calculate volume metrics of glance for pod openstack/glance-default-internal-api-0" (21-Jan-2026 16:39:34.415) (total time: 1441ms): Jan 21 16:39:36 crc kubenswrapper[4773]: Trace[1558241512]: [1.441783171s] [1.441783171s] END Jan 21 16:39:36 crc kubenswrapper[4773]: E0121 16:39:36.402356 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83918de1_f089_46b5_99e4_b249fbe09d65.slice/crio-2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-conmon-4d1eb5666e49486c3f4eb94fc5433702d0d7a2ca2bd72e231c235b7e2964ebfa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83918de1_f089_46b5_99e4_b249fbe09d65.slice/crio-conmon-2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f1de73_4ee0_4eda_8709_d1642d8452f2.slice/crio-conmon-b8a940e20c1b234f977ea1ee3559836d89597f4e87057cdae07793b34ffa8187.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode47fdd5c_ea4d_4bf2_a46b_f1a2d7cbfd20.slice/crio-78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode47fdd5c_ea4d_4bf2_a46b_f1a2d7cbfd20.slice/crio-conmon-78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e.scope\": RecentStats: unable to find data in memory cache]" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.866300 4773 generic.go:334] "Generic (PLEG): container finished" podID="bfc5861c-71cc-4485-8fbd-cc661354fe03" containerID="aa7285e5731cbc9a2cb99d963ada37c331cb13e70c84423c583ae6f54aca8a1d" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.866464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" event={"ID":"bfc5861c-71cc-4485-8fbd-cc661354fe03","Type":"ContainerDied","Data":"aa7285e5731cbc9a2cb99d963ada37c331cb13e70c84423c583ae6f54aca8a1d"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.867315 4773 scope.go:117] "RemoveContainer" containerID="aa7285e5731cbc9a2cb99d963ada37c331cb13e70c84423c583ae6f54aca8a1d" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.872472 4773 generic.go:334] "Generic (PLEG): container finished" podID="eeb5c272-4544-47a4-8d08-187872fea7bd" containerID="f70f54f0a982017893ff69fda35238edecc629ac57c7578a1e297a66bd5073cb" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.872766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" 
event={"ID":"eeb5c272-4544-47a4-8d08-187872fea7bd","Type":"ContainerDied","Data":"f70f54f0a982017893ff69fda35238edecc629ac57c7578a1e297a66bd5073cb"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.874181 4773 scope.go:117] "RemoveContainer" containerID="f70f54f0a982017893ff69fda35238edecc629ac57c7578a1e297a66bd5073cb" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.877000 4773 generic.go:334] "Generic (PLEG): container finished" podID="83918de1-f089-46b5-99e4-b249fbe09d65" containerID="2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.877110 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" event={"ID":"83918de1-f089-46b5-99e4-b249fbe09d65","Type":"ContainerDied","Data":"2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.879276 4773 scope.go:117] "RemoveContainer" containerID="2b7ef5f159947b733acd00ac1e6cac4912faafd514203b87f141f5e37c1b9593" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.885251 4773 generic.go:334] "Generic (PLEG): container finished" podID="baf015b3-f5b5-4467-8469-bccd49ba94ae" containerID="9d84d6a45260cc040a48bc617674e09a8660a75757925e87a582c65e8ed90cf9" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.885341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" event={"ID":"baf015b3-f5b5-4467-8469-bccd49ba94ae","Type":"ContainerDied","Data":"9d84d6a45260cc040a48bc617674e09a8660a75757925e87a582c65e8ed90cf9"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.885886 4773 scope.go:117] "RemoveContainer" containerID="9d84d6a45260cc040a48bc617674e09a8660a75757925e87a582c65e8ed90cf9" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.888527 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.896858 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.896933 4773 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4d1eb5666e49486c3f4eb94fc5433702d0d7a2ca2bd72e231c235b7e2964ebfa" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.897075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4d1eb5666e49486c3f4eb94fc5433702d0d7a2ca2bd72e231c235b7e2964ebfa"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.897113 4773 scope.go:117] "RemoveContainer" containerID="e40a9f09ff0c668fa81fcf3c277a660b94ce5b742062914dbf2df8620d761c2a" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.898111 4773 scope.go:117] "RemoveContainer" containerID="4d1eb5666e49486c3f4eb94fc5433702d0d7a2ca2bd72e231c235b7e2964ebfa" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.902156 4773 generic.go:334] "Generic (PLEG): container finished" podID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerID="0b19b670bef9c4a202ea9fc1002429def9982670adf36293952817b0d3aebba5" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.902296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" event={"ID":"d8cb173b-7eaa-4183-8028-0a1c4730097c","Type":"ContainerDied","Data":"0b19b670bef9c4a202ea9fc1002429def9982670adf36293952817b0d3aebba5"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 
16:39:36.903768 4773 scope.go:117] "RemoveContainer" containerID="0b19b670bef9c4a202ea9fc1002429def9982670adf36293952817b0d3aebba5" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.938797 4773 generic.go:334] "Generic (PLEG): container finished" podID="e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20" containerID="78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.938900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" event={"ID":"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20","Type":"ContainerDied","Data":"78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.939540 4773 scope.go:117] "RemoveContainer" containerID="78056926acc618baf4eb1e65107b6152d575381c535a2a8a77816e6103ead07e" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.945303 4773 generic.go:334] "Generic (PLEG): container finished" podID="ddac57c3-b102-4cfc-8b1e-53de342cef39" containerID="e1b3cc99fef6273461589e900133af9bcfa65f3809f9d55a0102370fa453455d" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.945570 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" event={"ID":"ddac57c3-b102-4cfc-8b1e-53de342cef39","Type":"ContainerDied","Data":"e1b3cc99fef6273461589e900133af9bcfa65f3809f9d55a0102370fa453455d"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.946504 4773 scope.go:117] "RemoveContainer" containerID="e1b3cc99fef6273461589e900133af9bcfa65f3809f9d55a0102370fa453455d" Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.970314 4773 generic.go:334] "Generic (PLEG): container finished" podID="32f1de73-4ee0-4eda-8709-d1642d8452f2" containerID="b8a940e20c1b234f977ea1ee3559836d89597f4e87057cdae07793b34ffa8187" exitCode=1 Jan 21 16:39:36 crc kubenswrapper[4773]: 
I0121 16:39:36.970367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" event={"ID":"32f1de73-4ee0-4eda-8709-d1642d8452f2","Type":"ContainerDied","Data":"b8a940e20c1b234f977ea1ee3559836d89597f4e87057cdae07793b34ffa8187"} Jan 21 16:39:36 crc kubenswrapper[4773]: I0121 16:39:36.971119 4773 scope.go:117] "RemoveContainer" containerID="b8a940e20c1b234f977ea1ee3559836d89597f4e87057cdae07793b34ffa8187" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.087387 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536069 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"] Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536570 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="extract-utilities" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536597 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="extract-utilities" Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536638 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536651 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="extract-content" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536658 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="extract-content" Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536674 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536683 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536713 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="extract-utilities" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536723 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="extract-utilities" Jan 21 16:39:37 crc kubenswrapper[4773]: E0121 16:39:37.536742 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="extract-content" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536752 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="extract-content" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.536985 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2787e8f5-fb49-468f-8db7-afe2af0efa5f" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.537010 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9abb9509-5532-46eb-985b-50ba1cb08eb7" containerName="registry-server" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.538680 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.566551 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"] Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.613405 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqqt\" (UniqueName: \"kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.613463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.613552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.717131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.717348 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4lqqt\" (UniqueName: \"kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.717378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.727596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.733105 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.777745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqqt\" (UniqueName: \"kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt\") pod \"redhat-operators-pqbmp\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") " pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.862525 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqbmp" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.984286 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" event={"ID":"bfc5861c-71cc-4485-8fbd-cc661354fe03","Type":"ContainerStarted","Data":"18e6677490f2746f51877482731ea0350885c541d06a254435eb10c726a2d312"} Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.984580 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.988624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" event={"ID":"eeb5c272-4544-47a4-8d08-187872fea7bd","Type":"ContainerStarted","Data":"598d0945178f47b2e93bb2fbeae487a168aed6f14c433131997e5d4452e10a2d"} Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.989534 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.991393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" event={"ID":"ddac57c3-b102-4cfc-8b1e-53de342cef39","Type":"ContainerStarted","Data":"39435f2fbb978c409fb6dac80294472586262b631cde122e8a82d4ac6f3fb50e"} Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.991915 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.996662 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" 
event={"ID":"32f1de73-4ee0-4eda-8709-d1642d8452f2","Type":"ContainerStarted","Data":"d788c2b79d39041b64d3a36b69dc710e2af877c7862e870e568abca262504660"} Jan 21 16:39:37 crc kubenswrapper[4773]: I0121 16:39:37.997530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.000337 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.034094 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" event={"ID":"d8cb173b-7eaa-4183-8028-0a1c4730097c","Type":"ContainerStarted","Data":"f5f9f9b91a6733b122e58c5f05bd2f33d14e5f58d79e39d0f2a9c30c1c669f2e"} Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.035539 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.068406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" event={"ID":"e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20","Type":"ContainerStarted","Data":"f89e7666d342312b7a72b9f5bcedb47ce32f7c88f7b18eda53538deb6cd7e430"} Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.069771 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.080631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" 
event={"ID":"baf015b3-f5b5-4467-8469-bccd49ba94ae","Type":"ContainerStarted","Data":"648f73ae180442287ddb30a71cc23da87642c3c50e30c513c331e78bfae3af6f"} Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.081014 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.095513 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" event={"ID":"83918de1-f089-46b5-99e4-b249fbe09d65","Type":"ContainerStarted","Data":"95847e0815cf102706b21d6547c2cdb481edc4fef409cb3b565ba0654b10b8c3"} Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.096392 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 16:39:38 crc kubenswrapper[4773]: I0121 16:39:38.516239 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"] Jan 21 16:39:39 crc kubenswrapper[4773]: I0121 16:39:39.109239 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerStarted","Data":"d1610258e387e6af79c4c21b24da4a45f36d1a1431c5576a453f2112e88da7e2"} Jan 21 16:39:39 crc kubenswrapper[4773]: I0121 16:39:39.205797 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 16:39:39 crc kubenswrapper[4773]: I0121 16:39:39.992210 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 16:39:40 crc kubenswrapper[4773]: I0121 16:39:40.132458 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 21 16:39:40 crc kubenswrapper[4773]: I0121 16:39:40.134776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae7f82a8ad04d5320675f96dc85385f17ba541aa4d70198d445fe2e54285951f"} Jan 21 16:39:40 crc kubenswrapper[4773]: I0121 16:39:40.137500 4773 generic.go:334] "Generic (PLEG): container finished" podID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerID="9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e" exitCode=0 Jan 21 16:39:40 crc kubenswrapper[4773]: I0121 16:39:40.137707 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerDied","Data":"9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e"} Jan 21 16:39:43 crc kubenswrapper[4773]: I0121 16:39:43.369063 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-57bcf57cd7-v96fc" Jan 21 16:39:44 crc kubenswrapper[4773]: I0121 16:39:44.335855 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 16:39:45 crc kubenswrapper[4773]: I0121 16:39:45.303538 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7c6777f5bd-z474b" Jan 21 16:39:45 crc kubenswrapper[4773]: I0121 16:39:45.778058 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" 
output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:39:46 crc kubenswrapper[4773]: I0121 16:39:46.811926 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 21 16:39:46 crc kubenswrapper[4773]: I0121 16:39:46.811974 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 21 16:39:47 crc kubenswrapper[4773]: I0121 16:39:47.089357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" Jan 21 16:39:47 crc kubenswrapper[4773]: I0121 16:39:47.384660 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:39:47 crc kubenswrapper[4773]: E0121 16:39:47.384947 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:39:48 crc kubenswrapper[4773]: I0121 16:39:48.890420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-c79x5" Jan 21 16:39:49 crc kubenswrapper[4773]: I0121 16:39:49.241111 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-kn644" Jan 21 16:39:49 crc kubenswrapper[4773]: I0121 16:39:49.558394 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-94knv" Jan 21 16:39:51 crc kubenswrapper[4773]: I0121 16:39:51.812303 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 21 16:39:55 crc kubenswrapper[4773]: I0121 16:39:55.777027 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:39:56 crc kubenswrapper[4773]: I0121 16:39:56.809843 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 21 16:39:56 crc kubenswrapper[4773]: I0121 16:39:56.809959 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 21 16:39:56 crc kubenswrapper[4773]: I0121 16:39:56.811208 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"6c7a6fa36cb3f836b086715d46cf7bead180db530e686065163d192999568b0a"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Jan 21 16:39:56 crc kubenswrapper[4773]: I0121 16:39:56.811303 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-central-agent" containerID="cri-o://6c7a6fa36cb3f836b086715d46cf7bead180db530e686065163d192999568b0a" gracePeriod=30 Jan 21 16:40:00 crc kubenswrapper[4773]: I0121 16:40:00.282254 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-lsnb5" podUID="6f1b2b84-a0ef-43f5-987e-3960271487b8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:40:00 crc kubenswrapper[4773]: I0121 16:40:00.541940 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2bt26" podUID="6411a5d3-7b7b-4735-b01c-7c4aa0d5509c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:40:00 crc kubenswrapper[4773]: I0121 16:40:00.541942 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-xs55j" podUID="7ab140ad-f64b-45e3-a393-f66567e98a9f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.96:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:40:00 crc kubenswrapper[4773]: I0121 16:40:00.585918 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5c4ff57dc8-78tss" podUID="4a9d0079-9636-4913-95fd-305e8d54280d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 16:40:00 crc kubenswrapper[4773]: I0121 16:40:00.772806 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/kube-state-metrics-0" podUID="0182c704-9c2c-460e-8fb3-083edaa77855" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.229:8080/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:02 crc kubenswrapper[4773]: I0121 16:40:02.387643 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"
Jan 21 16:40:02 crc kubenswrapper[4773]: E0121 16:40:02.389623 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:40:03 crc kubenswrapper[4773]: I0121 16:40:03.400330 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 16:40:03 crc kubenswrapper[4773]: I0121 16:40:03.400390 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:05 crc kubenswrapper[4773]: E0121 16:40:05.283584 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:05 crc kubenswrapper[4773]: E0121 16:40:05.811165 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T16:39:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T16:39:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T16:39:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T16:39:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:05 crc kubenswrapper[4773]: I0121 16:40:05.818110 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:05 crc kubenswrapper[4773]: I0121 16:40:05.818110 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz" podUID="d8cb173b-7eaa-4183-8028-0a1c4730097c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:08 crc kubenswrapper[4773]: I0121 16:40:08.456203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerStarted","Data":"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"}
Jan 21 16:40:08 crc kubenswrapper[4773]: I0121 16:40:08.461052 4773 generic.go:334] "Generic (PLEG): container finished" podID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerID="6c7a6fa36cb3f836b086715d46cf7bead180db530e686065163d192999568b0a" exitCode=0
Jan 21 16:40:08 crc kubenswrapper[4773]: I0121 16:40:08.461114 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerDied","Data":"6c7a6fa36cb3f836b086715d46cf7bead180db530e686065163d192999568b0a"}
Jan 21 16:40:09 crc kubenswrapper[4773]: I0121 16:40:09.205826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 16:40:09 crc kubenswrapper[4773]: I0121 16:40:09.475811 4773 generic.go:334] "Generic (PLEG): container finished" podID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerID="2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f" exitCode=0
Jan 21 16:40:09 crc kubenswrapper[4773]: I0121 16:40:09.475858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerDied","Data":"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"}
Jan 21 16:40:10 crc kubenswrapper[4773]: I0121 16:40:10.135543 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5dcc476cd5-nk2zw"
Jan 21 16:40:13 crc kubenswrapper[4773]: I0121 16:40:13.807094 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="fd966624-79ad-4926-9253-741b8f1e6fe4" containerName="ovn-northd" probeResult="failure" output="command timed out"
Jan 21 16:40:13 crc kubenswrapper[4773]: I0121 16:40:13.807312 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="fd966624-79ad-4926-9253-741b8f1e6fe4" containerName="ovn-northd" probeResult="failure" output="command timed out"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.335630 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.335947 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.338357 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.349581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.386754 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"
Jan 21 16:40:14 crc kubenswrapper[4773]: E0121 16:40:14.386965 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.400964 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.416880 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7pm5\" (UniqueName: \"kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.417076 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.417181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.519098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.519202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7pm5\" (UniqueName: \"kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.519297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.519957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.519957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.535259 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 16:40:14 crc kubenswrapper[4773]: I0121 16:40:14.739928 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zfrqz"
Jan 21 16:40:17 crc kubenswrapper[4773]: I0121 16:40:17.808129 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="61b8d46c-ed1d-4e8c-9d65-4c901fc300e4" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Jan 21 16:40:18 crc kubenswrapper[4773]: I0121 16:40:18.128941 4773 patch_prober.go:28] interesting pod/loki-operator-controller-manager-55d66b9568-sfgsj container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 16:40:18 crc kubenswrapper[4773]: I0121 16:40:18.129016 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-55d66b9568-sfgsj" podUID="e47fdd5c-ea4d-4bf2-a46b-f1a2d7cbfd20" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 21 16:40:19 crc kubenswrapper[4773]: I0121 16:40:19.721354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7pm5\" (UniqueName: \"kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5\") pod \"redhat-marketplace-zk5kl\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") " pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:19 crc kubenswrapper[4773]: I0121 16:40:19.804708 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:20 crc kubenswrapper[4773]: I0121 16:40:20.836446 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:21 crc kubenswrapper[4773]: I0121 16:40:21.608636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b8d46c-ed1d-4e8c-9d65-4c901fc300e4","Type":"ContainerStarted","Data":"1e74ae20c47e6ded1524c47bfc147dafe09ef056cfb309782ae0e70184fa3ccc"}
Jan 21 16:40:21 crc kubenswrapper[4773]: I0121 16:40:21.611147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerStarted","Data":"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"}
Jan 21 16:40:21 crc kubenswrapper[4773]: I0121 16:40:21.611181 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerStarted","Data":"e4948cf4bc15e01bc86bf6556fbbe5783d5d049216bb7a0da6f572c997132ef1"}
Jan 21 16:40:21 crc kubenswrapper[4773]: I0121 16:40:21.615077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerStarted","Data":"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"}
Jan 21 16:40:21 crc kubenswrapper[4773]: I0121 16:40:21.685258 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqbmp" podStartSLOduration=4.027225385 podStartE2EDuration="44.685238573s" podCreationTimestamp="2026-01-21 16:39:37 +0000 UTC" firstStartedPulling="2026-01-21 16:39:40.146983394 +0000 UTC m=+4545.071473016" lastFinishedPulling="2026-01-21 16:40:20.804996572 +0000 UTC m=+4585.729486204" observedRunningTime="2026-01-21 16:40:21.678839989 +0000 UTC m=+4586.603329611" watchObservedRunningTime="2026-01-21 16:40:21.685238573 +0000 UTC m=+4586.609728195"
Jan 21 16:40:23 crc kubenswrapper[4773]: I0121 16:40:23.634894 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerID="e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39" exitCode=0
Jan 21 16:40:23 crc kubenswrapper[4773]: I0121 16:40:23.634957 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerDied","Data":"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"}
Jan 21 16:40:25 crc kubenswrapper[4773]: I0121 16:40:25.661966 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerStarted","Data":"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"}
Jan 21 16:40:26 crc kubenswrapper[4773]: I0121 16:40:26.674428 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerID="ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38" exitCode=0
Jan 21 16:40:26 crc kubenswrapper[4773]: I0121 16:40:26.674530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerDied","Data":"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"}
Jan 21 16:40:27 crc kubenswrapper[4773]: I0121 16:40:27.690479 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerStarted","Data":"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"}
Jan 21 16:40:27 crc kubenswrapper[4773]: I0121 16:40:27.713275 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zk5kl" podStartSLOduration=10.260621699 podStartE2EDuration="13.713254012s" podCreationTimestamp="2026-01-21 16:40:14 +0000 UTC" firstStartedPulling="2026-01-21 16:40:23.63804678 +0000 UTC m=+4588.562536402" lastFinishedPulling="2026-01-21 16:40:27.090679093 +0000 UTC m=+4592.015168715" observedRunningTime="2026-01-21 16:40:27.710062216 +0000 UTC m=+4592.634551848" watchObservedRunningTime="2026-01-21 16:40:27.713254012 +0000 UTC m=+4592.637743634"
Jan 21 16:40:27 crc kubenswrapper[4773]: I0121 16:40:27.863129 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:27 crc kubenswrapper[4773]: I0121 16:40:27.863461 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:28 crc kubenswrapper[4773]: I0121 16:40:28.917156 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pqbmp" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="registry-server" probeResult="failure" output=<
Jan 21 16:40:28 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 16:40:28 crc kubenswrapper[4773]: >
Jan 21 16:40:29 crc kubenswrapper[4773]: I0121 16:40:29.383523 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"
Jan 21 16:40:29 crc kubenswrapper[4773]: E0121 16:40:29.383945 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:40:29 crc kubenswrapper[4773]: I0121 16:40:29.806855 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:29 crc kubenswrapper[4773]: I0121 16:40:29.806913 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:30 crc kubenswrapper[4773]: I0121 16:40:30.861888 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-zk5kl" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="registry-server" probeResult="failure" output=<
Jan 21 16:40:30 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s
Jan 21 16:40:30 crc kubenswrapper[4773]: >
Jan 21 16:40:37 crc kubenswrapper[4773]: I0121 16:40:37.917276 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:37 crc kubenswrapper[4773]: I0121 16:40:37.968383 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:38 crc kubenswrapper[4773]: I0121 16:40:38.793836 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"]
Jan 21 16:40:39 crc kubenswrapper[4773]: I0121 16:40:39.814737 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqbmp" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="registry-server" containerID="cri-o://1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870" gracePeriod=2
Jan 21 16:40:39 crc kubenswrapper[4773]: I0121 16:40:39.873670 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:39 crc kubenswrapper[4773]: I0121 16:40:39.932148 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.383924 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184"
Jan 21 16:40:40 crc kubenswrapper[4773]: E0121 16:40:40.384404 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.500167 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.592449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities\") pod \"81fa3e1f-6a75-4349-bb16-3610dea4518b\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") "
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.593166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content\") pod \"81fa3e1f-6a75-4349-bb16-3610dea4518b\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") "
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.593111 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities" (OuterVolumeSpecName: "utilities") pod "81fa3e1f-6a75-4349-bb16-3610dea4518b" (UID: "81fa3e1f-6a75-4349-bb16-3610dea4518b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.595825 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqqt\" (UniqueName: \"kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt\") pod \"81fa3e1f-6a75-4349-bb16-3610dea4518b\" (UID: \"81fa3e1f-6a75-4349-bb16-3610dea4518b\") "
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.596750 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.601511 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt" (OuterVolumeSpecName: "kube-api-access-4lqqt") pod "81fa3e1f-6a75-4349-bb16-3610dea4518b" (UID: "81fa3e1f-6a75-4349-bb16-3610dea4518b"). InnerVolumeSpecName "kube-api-access-4lqqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.699411 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqqt\" (UniqueName: \"kubernetes.io/projected/81fa3e1f-6a75-4349-bb16-3610dea4518b-kube-api-access-4lqqt\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.715742 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81fa3e1f-6a75-4349-bb16-3610dea4518b" (UID: "81fa3e1f-6a75-4349-bb16-3610dea4518b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.801031 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fa3e1f-6a75-4349-bb16-3610dea4518b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.832123 4773 generic.go:334] "Generic (PLEG): container finished" podID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerID="1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870" exitCode=0
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.833935 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqbmp"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.833930 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerDied","Data":"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"}
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.833994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqbmp" event={"ID":"81fa3e1f-6a75-4349-bb16-3610dea4518b","Type":"ContainerDied","Data":"d1610258e387e6af79c4c21b24da4a45f36d1a1431c5576a453f2112e88da7e2"}
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.834021 4773 scope.go:117] "RemoveContainer" containerID="1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.857519 4773 scope.go:117] "RemoveContainer" containerID="2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.872120 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"]
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.881710 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqbmp"]
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.896444 4773 scope.go:117] "RemoveContainer" containerID="9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.930708 4773 scope.go:117] "RemoveContainer" containerID="1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"
Jan 21 16:40:40 crc kubenswrapper[4773]: E0121 16:40:40.931208 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870\": container with ID starting with 1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870 not found: ID does not exist" containerID="1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.931249 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870"} err="failed to get container status \"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870\": rpc error: code = NotFound desc = could not find container \"1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870\": container with ID starting with 1967fd165c74d053ba1856c2ed121ad12a8c06d34f883de8d077b436c6da1870 not found: ID does not exist"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.931276 4773 scope.go:117] "RemoveContainer" containerID="2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"
Jan 21 16:40:40 crc kubenswrapper[4773]: E0121 16:40:40.931574 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f\": container with ID starting with 2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f not found: ID does not exist" containerID="2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.931607 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f"} err="failed to get container status \"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f\": rpc error: code = NotFound desc = could not find container \"2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f\": container with ID starting with 2698aa66cdde7def6e301428c9658cf44d9f9a3344e9437992d3a110404a6c0f not found: ID does not exist"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.931628 4773 scope.go:117] "RemoveContainer" containerID="9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e"
Jan 21 16:40:40 crc kubenswrapper[4773]: E0121 16:40:40.932113 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e\": container with ID starting with 9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e not found: ID does not exist" containerID="9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e"
Jan 21 16:40:40 crc kubenswrapper[4773]: I0121 16:40:40.932139 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e"} err="failed to get container status \"9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e\": rpc error: code = NotFound desc = could not find container \"9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e\": container with ID starting with 9acfc5630770b967aab6d987527828d1ec027b6c40f9e70bcbdeab6f586c6e7e not found: ID does not exist"
Jan 21 16:40:41 crc kubenswrapper[4773]: I0121 16:40:41.399286 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" path="/var/lib/kubelet/pods/81fa3e1f-6a75-4349-bb16-3610dea4518b/volumes"
Jan 21 16:40:41 crc kubenswrapper[4773]: I0121 16:40:41.792225 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:41 crc kubenswrapper[4773]: I0121 16:40:41.842749 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zk5kl" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="registry-server" containerID="cri-o://86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435" gracePeriod=2
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.479856 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.534347 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7pm5\" (UniqueName: \"kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5\") pod \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") "
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.534565 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content\") pod \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") "
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.534603 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities\") pod \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\" (UID: \"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6\") "
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.535550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities" (OuterVolumeSpecName: "utilities") pod "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" (UID: "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.536051 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.552119 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5" (OuterVolumeSpecName: "kube-api-access-n7pm5") pod "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" (UID: "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6"). InnerVolumeSpecName "kube-api-access-n7pm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.558170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" (UID: "1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.638510 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.638550 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7pm5\" (UniqueName: \"kubernetes.io/projected/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6-kube-api-access-n7pm5\") on node \"crc\" DevicePath \"\""
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.855853 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerID="86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435" exitCode=0
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.855908 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerDied","Data":"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"}
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.855936 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk5kl" event={"ID":"1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6","Type":"ContainerDied","Data":"e4948cf4bc15e01bc86bf6556fbbe5783d5d049216bb7a0da6f572c997132ef1"}
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.855971 4773 scope.go:117] "RemoveContainer" containerID="86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.856116 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk5kl"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.882326 4773 scope.go:117] "RemoveContainer" containerID="ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.910079 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.920300 4773 scope.go:117] "RemoveContainer" containerID="e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.922107 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk5kl"]
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.969035 4773 scope.go:117] "RemoveContainer" containerID="86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"
Jan 21 16:40:42 crc kubenswrapper[4773]: E0121 16:40:42.969782 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435\": container with ID starting with 86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435 not found: ID does not exist" containerID="86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.969815 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435"} err="failed to get container status \"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435\": rpc error: code = NotFound desc = could not find container \"86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435\": container with ID starting with 86582fa314a7102c5de05c80b5ca1a6a03a6c5f5dcb093dfeb2bc7aaeba8b435 not found: ID does not exist"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.969835 4773 scope.go:117] "RemoveContainer" containerID="ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"
Jan 21 16:40:42 crc kubenswrapper[4773]: E0121 16:40:42.970257 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38\": container with ID starting with ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38 not found: ID does not exist" containerID="ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.970311 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38"} err="failed to get container status \"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38\": rpc error: code = NotFound desc = could not find container \"ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38\": container with ID starting with ed2e80be23796ceb497f223fda4d0e7b85dbec6096901ddb175229b46fd66c38 not found: ID does not exist"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.970344 4773 scope.go:117] "RemoveContainer" containerID="e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"
Jan 21 16:40:42 crc kubenswrapper[4773]: E0121 16:40:42.970671 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39\": container with ID starting with e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39 not found: ID does not exist" containerID="e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"
Jan 21 16:40:42 crc kubenswrapper[4773]: I0121 16:40:42.970719 4773 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39"} err="failed to get container status \"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39\": rpc error: code = NotFound desc = could not find container \"e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39\": container with ID starting with e617a479280b765d18c99f006f9cc4838fd5373d4d9c32fa26dcb233d642fb39 not found: ID does not exist" Jan 21 16:40:43 crc kubenswrapper[4773]: I0121 16:40:43.396608 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" path="/var/lib/kubelet/pods/1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6/volumes" Jan 21 16:40:53 crc kubenswrapper[4773]: I0121 16:40:53.384061 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:40:53 crc kubenswrapper[4773]: E0121 16:40:53.384931 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:41:05 crc kubenswrapper[4773]: I0121 16:41:05.397410 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:41:05 crc kubenswrapper[4773]: E0121 16:41:05.399343 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:41:19 crc kubenswrapper[4773]: I0121 16:41:19.383576 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:41:19 crc kubenswrapper[4773]: E0121 16:41:19.384567 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:41:30 crc kubenswrapper[4773]: I0121 16:41:30.383684 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:41:31 crc kubenswrapper[4773]: I0121 16:41:31.328559 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585"} Jan 21 16:43:55 crc kubenswrapper[4773]: I0121 16:43:55.205946 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:43:55 crc kubenswrapper[4773]: I0121 16:43:55.206632 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:44:25 crc kubenswrapper[4773]: I0121 16:44:25.206230 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:44:25 crc kubenswrapper[4773]: I0121 16:44:25.206795 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.206270 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.208052 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.208158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.208983 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.209119 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585" gracePeriod=600 Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.358547 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585" exitCode=0 Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.358630 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585"} Jan 21 16:44:55 crc kubenswrapper[4773]: I0121 16:44:55.359226 4773 scope.go:117] "RemoveContainer" containerID="9256c0d16b095a2f53f222061b31658e592771833bb477786d473b333a647184" Jan 21 16:44:56 crc kubenswrapper[4773]: I0121 16:44:56.369632 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"} Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.217630 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f"] Jan 21 16:45:00 crc kubenswrapper[4773]: 
E0121 16:45:00.218885 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.218902 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: E0121 16:45:00.218926 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="extract-content" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.218935 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="extract-content" Jan 21 16:45:00 crc kubenswrapper[4773]: E0121 16:45:00.218953 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="extract-content" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.218961 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="extract-content" Jan 21 16:45:00 crc kubenswrapper[4773]: E0121 16:45:00.218986 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.219040 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: E0121 16:45:00.219054 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="extract-utilities" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.219062 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="extract-utilities" Jan 21 16:45:00 crc kubenswrapper[4773]: E0121 
16:45:00.219083 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="extract-utilities" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.219090 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="extract-utilities" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.219328 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d85cf6a-ecb2-4fd3-bc31-199b5abdfef6" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.219359 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fa3e1f-6a75-4349-bb16-3610dea4518b" containerName="registry-server" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.220367 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.222641 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.222789 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.230652 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f"] Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.265773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.265916 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.266005 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpd2\" (UniqueName: \"kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.369686 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.369812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpd2\" (UniqueName: \"kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.369974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.371278 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.385558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.395380 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpd2\" (UniqueName: \"kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2\") pod \"collect-profiles-29483565-npv4f\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:00 crc kubenswrapper[4773]: I0121 16:45:00.550469 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:01 crc kubenswrapper[4773]: I0121 16:45:01.144779 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f"] Jan 21 16:45:01 crc kubenswrapper[4773]: I0121 16:45:01.425127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" event={"ID":"5a941a6b-6c87-4175-9ac1-13ce0de5fe96","Type":"ContainerStarted","Data":"5e57a545ab1639ce8261766044994afdde602e4794fa6476ff95bdc3133e32f8"} Jan 21 16:45:01 crc kubenswrapper[4773]: I0121 16:45:01.425198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" event={"ID":"5a941a6b-6c87-4175-9ac1-13ce0de5fe96","Type":"ContainerStarted","Data":"3cb4e0ac5fdf69004f00f57a8197740531b92f8a816ab24a5d892a1949fc542e"} Jan 21 16:45:01 crc kubenswrapper[4773]: I0121 16:45:01.458951 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" podStartSLOduration=1.458929646 podStartE2EDuration="1.458929646s" podCreationTimestamp="2026-01-21 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 16:45:01.448266337 +0000 UTC m=+4866.372755949" watchObservedRunningTime="2026-01-21 16:45:01.458929646 +0000 UTC m=+4866.383419268" Jan 21 16:45:02 crc kubenswrapper[4773]: I0121 16:45:02.435339 4773 generic.go:334] "Generic (PLEG): container finished" podID="5a941a6b-6c87-4175-9ac1-13ce0de5fe96" containerID="5e57a545ab1639ce8261766044994afdde602e4794fa6476ff95bdc3133e32f8" exitCode=0 Jan 21 16:45:02 crc kubenswrapper[4773]: I0121 16:45:02.435676 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" event={"ID":"5a941a6b-6c87-4175-9ac1-13ce0de5fe96","Type":"ContainerDied","Data":"5e57a545ab1639ce8261766044994afdde602e4794fa6476ff95bdc3133e32f8"} Jan 21 16:45:03 crc kubenswrapper[4773]: I0121 16:45:03.975878 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.056106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume\") pod \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.056512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume\") pod \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.056669 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mpd2\" (UniqueName: \"kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2\") pod \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\" (UID: \"5a941a6b-6c87-4175-9ac1-13ce0de5fe96\") " Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.056585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume" (OuterVolumeSpecName: "config-volume") pod "5a941a6b-6c87-4175-9ac1-13ce0de5fe96" (UID: "5a941a6b-6c87-4175-9ac1-13ce0de5fe96"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.057417 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.064857 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5a941a6b-6c87-4175-9ac1-13ce0de5fe96" (UID: "5a941a6b-6c87-4175-9ac1-13ce0de5fe96"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.075658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2" (OuterVolumeSpecName: "kube-api-access-2mpd2") pod "5a941a6b-6c87-4175-9ac1-13ce0de5fe96" (UID: "5a941a6b-6c87-4175-9ac1-13ce0de5fe96"). InnerVolumeSpecName "kube-api-access-2mpd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.159168 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.159446 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mpd2\" (UniqueName: \"kubernetes.io/projected/5a941a6b-6c87-4175-9ac1-13ce0de5fe96-kube-api-access-2mpd2\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.459054 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" event={"ID":"5a941a6b-6c87-4175-9ac1-13ce0de5fe96","Type":"ContainerDied","Data":"3cb4e0ac5fdf69004f00f57a8197740531b92f8a816ab24a5d892a1949fc542e"} Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.459114 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb4e0ac5fdf69004f00f57a8197740531b92f8a816ab24a5d892a1949fc542e" Jan 21 16:45:04 crc kubenswrapper[4773]: I0121 16:45:04.459158 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483565-npv4f" Jan 21 16:45:05 crc kubenswrapper[4773]: I0121 16:45:05.076312 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9"] Jan 21 16:45:05 crc kubenswrapper[4773]: I0121 16:45:05.085976 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483520-rxgw9"] Jan 21 16:45:05 crc kubenswrapper[4773]: I0121 16:45:05.395915 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289fa5e2-355b-4012-853c-aad63a1cc1fe" path="/var/lib/kubelet/pods/289fa5e2-355b-4012-853c-aad63a1cc1fe/volumes" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.060522 4773 scope.go:117] "RemoveContainer" containerID="86bf2dfabc1458b3bb6ece8f85223de64a2c69bac7c905cc191e6d8383fac4a5" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.748111 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:36 crc kubenswrapper[4773]: E0121 16:45:36.748928 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a941a6b-6c87-4175-9ac1-13ce0de5fe96" containerName="collect-profiles" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.748950 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a941a6b-6c87-4175-9ac1-13ce0de5fe96" containerName="collect-profiles" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.749195 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a941a6b-6c87-4175-9ac1-13ce0de5fe96" containerName="collect-profiles" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.752413 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.770228 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.867042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpxs\" (UniqueName: \"kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.867106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.867146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.969296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.969403 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.969667 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpxs\" (UniqueName: \"kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.969873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:36 crc kubenswrapper[4773]: I0121 16:45:36.970027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:37 crc kubenswrapper[4773]: I0121 16:45:37.116599 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpxs\" (UniqueName: \"kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs\") pod \"certified-operators-f44r2\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:37 crc kubenswrapper[4773]: I0121 16:45:37.392020 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:37 crc kubenswrapper[4773]: I0121 16:45:37.929555 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:38 crc kubenswrapper[4773]: I0121 16:45:38.812839 4773 generic.go:334] "Generic (PLEG): container finished" podID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerID="5e0a714fbd1d58c2941867f7ba20b8648706dbf429b70f7f8985037a9e6a148a" exitCode=0 Jan 21 16:45:38 crc kubenswrapper[4773]: I0121 16:45:38.812949 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerDied","Data":"5e0a714fbd1d58c2941867f7ba20b8648706dbf429b70f7f8985037a9e6a148a"} Jan 21 16:45:38 crc kubenswrapper[4773]: I0121 16:45:38.814094 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerStarted","Data":"dd8998db81595b65ee989e7f2b143888b3ad72f3ba671c22f09630c5a96f355b"} Jan 21 16:45:38 crc kubenswrapper[4773]: I0121 16:45:38.814866 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:45:39 crc kubenswrapper[4773]: I0121 16:45:39.827063 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerStarted","Data":"44d4a431e9fc67f77435d463078bbae3cd5bffbae41f7d1234639bcdab45bf2e"} Jan 21 16:45:40 crc kubenswrapper[4773]: I0121 16:45:40.837887 4773 generic.go:334] "Generic (PLEG): container finished" podID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerID="44d4a431e9fc67f77435d463078bbae3cd5bffbae41f7d1234639bcdab45bf2e" exitCode=0 Jan 21 16:45:40 crc kubenswrapper[4773]: I0121 16:45:40.837987 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerDied","Data":"44d4a431e9fc67f77435d463078bbae3cd5bffbae41f7d1234639bcdab45bf2e"} Jan 21 16:45:41 crc kubenswrapper[4773]: I0121 16:45:41.853050 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerStarted","Data":"5e689b0e0207e95ac0bed3926295e528099671216e82b15295e85cfa23894c1f"} Jan 21 16:45:41 crc kubenswrapper[4773]: I0121 16:45:41.879644 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f44r2" podStartSLOduration=3.449337854 podStartE2EDuration="5.879624816s" podCreationTimestamp="2026-01-21 16:45:36 +0000 UTC" firstStartedPulling="2026-01-21 16:45:38.814582005 +0000 UTC m=+4903.739071627" lastFinishedPulling="2026-01-21 16:45:41.244868967 +0000 UTC m=+4906.169358589" observedRunningTime="2026-01-21 16:45:41.871576268 +0000 UTC m=+4906.796065920" watchObservedRunningTime="2026-01-21 16:45:41.879624816 +0000 UTC m=+4906.804114438" Jan 21 16:45:47 crc kubenswrapper[4773]: I0121 16:45:47.396770 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:47 crc kubenswrapper[4773]: I0121 16:45:47.397316 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:47 crc kubenswrapper[4773]: I0121 16:45:47.445957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:48 crc kubenswrapper[4773]: I0121 16:45:48.380758 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:48 crc kubenswrapper[4773]: I0121 
16:45:48.444116 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:49 crc kubenswrapper[4773]: I0121 16:45:49.927785 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f44r2" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="registry-server" containerID="cri-o://5e689b0e0207e95ac0bed3926295e528099671216e82b15295e85cfa23894c1f" gracePeriod=2 Jan 21 16:45:50 crc kubenswrapper[4773]: I0121 16:45:50.971915 4773 generic.go:334] "Generic (PLEG): container finished" podID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerID="5e689b0e0207e95ac0bed3926295e528099671216e82b15295e85cfa23894c1f" exitCode=0 Jan 21 16:45:50 crc kubenswrapper[4773]: I0121 16:45:50.972221 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerDied","Data":"5e689b0e0207e95ac0bed3926295e528099671216e82b15295e85cfa23894c1f"} Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.400610 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.467617 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities\") pod \"f97bb491-3df8-44a8-bd8d-888886e2256f\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.467931 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content\") pod \"f97bb491-3df8-44a8-bd8d-888886e2256f\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.467984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cpxs\" (UniqueName: \"kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs\") pod \"f97bb491-3df8-44a8-bd8d-888886e2256f\" (UID: \"f97bb491-3df8-44a8-bd8d-888886e2256f\") " Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.470339 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities" (OuterVolumeSpecName: "utilities") pod "f97bb491-3df8-44a8-bd8d-888886e2256f" (UID: "f97bb491-3df8-44a8-bd8d-888886e2256f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.477339 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs" (OuterVolumeSpecName: "kube-api-access-7cpxs") pod "f97bb491-3df8-44a8-bd8d-888886e2256f" (UID: "f97bb491-3df8-44a8-bd8d-888886e2256f"). InnerVolumeSpecName "kube-api-access-7cpxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.521792 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f97bb491-3df8-44a8-bd8d-888886e2256f" (UID: "f97bb491-3df8-44a8-bd8d-888886e2256f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.571266 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.571298 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cpxs\" (UniqueName: \"kubernetes.io/projected/f97bb491-3df8-44a8-bd8d-888886e2256f-kube-api-access-7cpxs\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.571308 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97bb491-3df8-44a8-bd8d-888886e2256f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.983437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f44r2" event={"ID":"f97bb491-3df8-44a8-bd8d-888886e2256f","Type":"ContainerDied","Data":"dd8998db81595b65ee989e7f2b143888b3ad72f3ba671c22f09630c5a96f355b"} Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.983748 4773 scope.go:117] "RemoveContainer" containerID="5e689b0e0207e95ac0bed3926295e528099671216e82b15295e85cfa23894c1f" Jan 21 16:45:51 crc kubenswrapper[4773]: I0121 16:45:51.983867 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f44r2" Jan 21 16:45:52 crc kubenswrapper[4773]: I0121 16:45:52.017992 4773 scope.go:117] "RemoveContainer" containerID="44d4a431e9fc67f77435d463078bbae3cd5bffbae41f7d1234639bcdab45bf2e" Jan 21 16:45:52 crc kubenswrapper[4773]: I0121 16:45:52.027595 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:52 crc kubenswrapper[4773]: I0121 16:45:52.040919 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f44r2"] Jan 21 16:45:52 crc kubenswrapper[4773]: I0121 16:45:52.043456 4773 scope.go:117] "RemoveContainer" containerID="5e0a714fbd1d58c2941867f7ba20b8648706dbf429b70f7f8985037a9e6a148a" Jan 21 16:45:53 crc kubenswrapper[4773]: I0121 16:45:53.396548 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" path="/var/lib/kubelet/pods/f97bb491-3df8-44a8-bd8d-888886e2256f/volumes" Jan 21 16:46:17 crc kubenswrapper[4773]: I0121 16:46:17.682953 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6dcbb8dcff-bjhnc" podUID="22988650-1474-4ba4-a6c0-2deb003ae3e7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.877127 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:21 crc kubenswrapper[4773]: E0121 16:46:21.879275 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="extract-utilities" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.879389 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="extract-utilities" Jan 21 16:46:21 crc kubenswrapper[4773]: E0121 16:46:21.879479 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="extract-content" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.879554 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="extract-content" Jan 21 16:46:21 crc kubenswrapper[4773]: E0121 16:46:21.879657 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="registry-server" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.879761 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="registry-server" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.880068 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97bb491-3df8-44a8-bd8d-888886e2256f" containerName="registry-server" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.882090 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:21 crc kubenswrapper[4773]: I0121 16:46:21.906025 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.011558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrql\" (UniqueName: \"kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.011619 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.011887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.113752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.114078 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2vrql\" (UniqueName: \"kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.114184 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.114667 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.114865 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.145396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrql\" (UniqueName: \"kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql\") pod \"community-operators-hp5v4\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.214386 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:22 crc kubenswrapper[4773]: I0121 16:46:22.841768 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:23 crc kubenswrapper[4773]: I0121 16:46:23.322892 4773 generic.go:334] "Generic (PLEG): container finished" podID="6653112c-c990-4e6c-8a40-22a53a265339" containerID="6dd41f7c6a5afe406ccd1cc29e4f9a51389809592f160d8ba9ebb9e74b8a090b" exitCode=0 Jan 21 16:46:23 crc kubenswrapper[4773]: I0121 16:46:23.323009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerDied","Data":"6dd41f7c6a5afe406ccd1cc29e4f9a51389809592f160d8ba9ebb9e74b8a090b"} Jan 21 16:46:23 crc kubenswrapper[4773]: I0121 16:46:23.323405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerStarted","Data":"c221f146da62199c054e21390d9669649c945cb2bc6185d37e722208816e3bef"} Jan 21 16:46:24 crc kubenswrapper[4773]: I0121 16:46:24.336916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerStarted","Data":"6ee0173984e7854b01550d3121af5c83465d85331b9fd357f7f1ef04de58b698"} Jan 21 16:46:25 crc kubenswrapper[4773]: I0121 16:46:25.347083 4773 generic.go:334] "Generic (PLEG): container finished" podID="6653112c-c990-4e6c-8a40-22a53a265339" containerID="6ee0173984e7854b01550d3121af5c83465d85331b9fd357f7f1ef04de58b698" exitCode=0 Jan 21 16:46:25 crc kubenswrapper[4773]: I0121 16:46:25.347135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" 
event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerDied","Data":"6ee0173984e7854b01550d3121af5c83465d85331b9fd357f7f1ef04de58b698"} Jan 21 16:46:27 crc kubenswrapper[4773]: I0121 16:46:27.377142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerStarted","Data":"8dc9c6bc1e6f64dfe6f79b22cb4da24c3f6d5b42446be59c3b6412ffa518eb64"} Jan 21 16:46:27 crc kubenswrapper[4773]: I0121 16:46:27.406752 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hp5v4" podStartSLOduration=3.956083776 podStartE2EDuration="6.406729961s" podCreationTimestamp="2026-01-21 16:46:21 +0000 UTC" firstStartedPulling="2026-01-21 16:46:23.326129582 +0000 UTC m=+4948.250619204" lastFinishedPulling="2026-01-21 16:46:25.776775767 +0000 UTC m=+4950.701265389" observedRunningTime="2026-01-21 16:46:27.399743491 +0000 UTC m=+4952.324233113" watchObservedRunningTime="2026-01-21 16:46:27.406729961 +0000 UTC m=+4952.331219583" Jan 21 16:46:32 crc kubenswrapper[4773]: I0121 16:46:32.215995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:32 crc kubenswrapper[4773]: I0121 16:46:32.216558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:32 crc kubenswrapper[4773]: I0121 16:46:32.270971 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:32 crc kubenswrapper[4773]: I0121 16:46:32.471172 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:32 crc kubenswrapper[4773]: I0121 16:46:32.529128 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:34 crc kubenswrapper[4773]: I0121 16:46:34.442310 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hp5v4" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="registry-server" containerID="cri-o://8dc9c6bc1e6f64dfe6f79b22cb4da24c3f6d5b42446be59c3b6412ffa518eb64" gracePeriod=2 Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.452505 4773 generic.go:334] "Generic (PLEG): container finished" podID="6653112c-c990-4e6c-8a40-22a53a265339" containerID="8dc9c6bc1e6f64dfe6f79b22cb4da24c3f6d5b42446be59c3b6412ffa518eb64" exitCode=0 Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.452542 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerDied","Data":"8dc9c6bc1e6f64dfe6f79b22cb4da24c3f6d5b42446be59c3b6412ffa518eb64"} Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.589065 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.701214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrql\" (UniqueName: \"kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql\") pod \"6653112c-c990-4e6c-8a40-22a53a265339\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.701392 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content\") pod \"6653112c-c990-4e6c-8a40-22a53a265339\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.701647 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities\") pod \"6653112c-c990-4e6c-8a40-22a53a265339\" (UID: \"6653112c-c990-4e6c-8a40-22a53a265339\") " Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.702351 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities" (OuterVolumeSpecName: "utilities") pod "6653112c-c990-4e6c-8a40-22a53a265339" (UID: "6653112c-c990-4e6c-8a40-22a53a265339"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.703000 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.708010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql" (OuterVolumeSpecName: "kube-api-access-2vrql") pod "6653112c-c990-4e6c-8a40-22a53a265339" (UID: "6653112c-c990-4e6c-8a40-22a53a265339"). InnerVolumeSpecName "kube-api-access-2vrql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.753334 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6653112c-c990-4e6c-8a40-22a53a265339" (UID: "6653112c-c990-4e6c-8a40-22a53a265339"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.806070 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6653112c-c990-4e6c-8a40-22a53a265339-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:35 crc kubenswrapper[4773]: I0121 16:46:35.806145 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrql\" (UniqueName: \"kubernetes.io/projected/6653112c-c990-4e6c-8a40-22a53a265339-kube-api-access-2vrql\") on node \"crc\" DevicePath \"\"" Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.463340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hp5v4" event={"ID":"6653112c-c990-4e6c-8a40-22a53a265339","Type":"ContainerDied","Data":"c221f146da62199c054e21390d9669649c945cb2bc6185d37e722208816e3bef"} Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.463680 4773 scope.go:117] "RemoveContainer" containerID="8dc9c6bc1e6f64dfe6f79b22cb4da24c3f6d5b42446be59c3b6412ffa518eb64" Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.463852 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hp5v4" Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.498044 4773 scope.go:117] "RemoveContainer" containerID="6ee0173984e7854b01550d3121af5c83465d85331b9fd357f7f1ef04de58b698" Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.502028 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.512289 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hp5v4"] Jan 21 16:46:36 crc kubenswrapper[4773]: I0121 16:46:36.529019 4773 scope.go:117] "RemoveContainer" containerID="6dd41f7c6a5afe406ccd1cc29e4f9a51389809592f160d8ba9ebb9e74b8a090b" Jan 21 16:46:37 crc kubenswrapper[4773]: I0121 16:46:37.397555 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6653112c-c990-4e6c-8a40-22a53a265339" path="/var/lib/kubelet/pods/6653112c-c990-4e6c-8a40-22a53a265339/volumes" Jan 21 16:46:55 crc kubenswrapper[4773]: I0121 16:46:55.206028 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:46:55 crc kubenswrapper[4773]: I0121 16:46:55.206552 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:47:25 crc kubenswrapper[4773]: I0121 16:47:25.205444 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:47:25 crc kubenswrapper[4773]: I0121 16:47:25.206006 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:47:55 crc kubenswrapper[4773]: I0121 16:47:55.206180 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:47:55 crc kubenswrapper[4773]: I0121 16:47:55.206750 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:47:55 crc kubenswrapper[4773]: I0121 16:47:55.206804 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:47:55 crc kubenswrapper[4773]: I0121 16:47:55.207647 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:47:55 crc kubenswrapper[4773]: I0121 16:47:55.207740 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" gracePeriod=600 Jan 21 16:47:55 crc kubenswrapper[4773]: E0121 16:47:55.328153 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:47:56 crc kubenswrapper[4773]: I0121 16:47:56.247547 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" exitCode=0 Jan 21 16:47:56 crc kubenswrapper[4773]: I0121 16:47:56.247629 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"} Jan 21 16:47:56 crc kubenswrapper[4773]: I0121 16:47:56.248250 4773 scope.go:117] "RemoveContainer" containerID="62d6e49d526fdc55abf704a47cfad8ee9474c196f164d218ccac83c0a2599585" Jan 21 16:47:56 crc kubenswrapper[4773]: I0121 16:47:56.249074 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:47:56 crc kubenswrapper[4773]: E0121 16:47:56.249591 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:48:06 crc kubenswrapper[4773]: I0121 16:48:06.384221 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:48:06 crc kubenswrapper[4773]: E0121 16:48:06.385055 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:48:19 crc kubenswrapper[4773]: I0121 16:48:19.383802 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:48:19 crc kubenswrapper[4773]: E0121 16:48:19.384540 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:48:30 crc kubenswrapper[4773]: I0121 16:48:30.385100 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:48:30 crc kubenswrapper[4773]: E0121 16:48:30.386618 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:48:44 crc kubenswrapper[4773]: I0121 16:48:44.384460 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:48:44 crc kubenswrapper[4773]: E0121 16:48:44.385586 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:48:59 crc kubenswrapper[4773]: I0121 16:48:59.384829 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:48:59 crc kubenswrapper[4773]: E0121 16:48:59.386758 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:49:10 crc kubenswrapper[4773]: I0121 16:49:10.383796 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:49:10 crc kubenswrapper[4773]: E0121 16:49:10.384575 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:49:23 crc kubenswrapper[4773]: I0121 16:49:23.384993 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:49:23 crc kubenswrapper[4773]: E0121 16:49:23.387260 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:49:36 crc kubenswrapper[4773]: I0121 16:49:36.383726 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:49:36 crc kubenswrapper[4773]: E0121 16:49:36.384444 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:49:51 crc kubenswrapper[4773]: I0121 16:49:51.384116 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:49:51 crc kubenswrapper[4773]: E0121 16:49:51.384887 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:02 crc kubenswrapper[4773]: I0121 16:50:02.384962 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:50:02 crc kubenswrapper[4773]: E0121 16:50:02.385845 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:13 crc kubenswrapper[4773]: I0121 16:50:13.384843 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:50:13 crc kubenswrapper[4773]: E0121 16:50:13.387889 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:26 crc kubenswrapper[4773]: I0121 16:50:26.384674 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:50:26 crc kubenswrapper[4773]: E0121 16:50:26.385524 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.954877 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:32 crc kubenswrapper[4773]: E0121 16:50:32.955958 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="registry-server" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.955992 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="registry-server" Jan 21 16:50:32 crc kubenswrapper[4773]: E0121 16:50:32.956035 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="extract-utilities" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.956044 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="extract-utilities" Jan 21 16:50:32 crc kubenswrapper[4773]: E0121 16:50:32.956067 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="extract-content" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.956075 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="extract-content" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.956325 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6653112c-c990-4e6c-8a40-22a53a265339" containerName="registry-server" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.962870 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:32 crc kubenswrapper[4773]: I0121 16:50:32.968910 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.053293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5rt\" (UniqueName: \"kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.053399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.053435 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.154825 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5rt\" (UniqueName: \"kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.154928 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.154961 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.155417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.156610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.614549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5rt\" (UniqueName: \"kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt\") pod \"redhat-operators-42kkr\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:33 crc kubenswrapper[4773]: I0121 16:50:33.633304 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:34 crc kubenswrapper[4773]: I0121 16:50:34.245349 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:34 crc kubenswrapper[4773]: I0121 16:50:34.788458 4773 generic.go:334] "Generic (PLEG): container finished" podID="05b2b505-374c-438a-936e-cc9209f9359a" containerID="aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18" exitCode=0 Jan 21 16:50:34 crc kubenswrapper[4773]: I0121 16:50:34.789218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerDied","Data":"aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18"} Jan 21 16:50:34 crc kubenswrapper[4773]: I0121 16:50:34.793367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerStarted","Data":"41c549b7f0c26eb7632da0d9323af51754cb3ddbb6cef9beb120131cd9f262cc"} Jan 21 16:50:36 crc kubenswrapper[4773]: I0121 16:50:36.811718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerStarted","Data":"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711"} Jan 21 16:50:39 crc kubenswrapper[4773]: I0121 16:50:39.383942 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:50:39 crc kubenswrapper[4773]: E0121 16:50:39.386275 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:39 crc kubenswrapper[4773]: I0121 16:50:39.844050 4773 generic.go:334] "Generic (PLEG): container finished" podID="05b2b505-374c-438a-936e-cc9209f9359a" containerID="089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711" exitCode=0 Jan 21 16:50:39 crc kubenswrapper[4773]: I0121 16:50:39.844116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerDied","Data":"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711"} Jan 21 16:50:39 crc kubenswrapper[4773]: I0121 16:50:39.847393 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 16:50:40 crc kubenswrapper[4773]: I0121 16:50:40.862152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerStarted","Data":"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e"} Jan 21 16:50:40 crc kubenswrapper[4773]: I0121 16:50:40.888026 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-42kkr" podStartSLOduration=3.464871964 podStartE2EDuration="8.888006427s" podCreationTimestamp="2026-01-21 16:50:32 +0000 UTC" firstStartedPulling="2026-01-21 16:50:34.792648767 +0000 UTC m=+5199.717138389" lastFinishedPulling="2026-01-21 16:50:40.21578323 +0000 UTC m=+5205.140272852" observedRunningTime="2026-01-21 16:50:40.880911724 +0000 UTC m=+5205.805401356" watchObservedRunningTime="2026-01-21 16:50:40.888006427 +0000 UTC m=+5205.812496049" Jan 21 16:50:43 crc kubenswrapper[4773]: I0121 16:50:43.633770 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:43 crc kubenswrapper[4773]: I0121 16:50:43.634068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:44 crc kubenswrapper[4773]: I0121 16:50:44.841146 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-42kkr" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="registry-server" probeResult="failure" output=< Jan 21 16:50:44 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 16:50:44 crc kubenswrapper[4773]: > Jan 21 16:50:50 crc kubenswrapper[4773]: I0121 16:50:50.384023 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:50:50 crc kubenswrapper[4773]: E0121 16:50:50.384829 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:50:53 crc kubenswrapper[4773]: I0121 16:50:53.679374 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:53 crc kubenswrapper[4773]: I0121 16:50:53.733827 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:53 crc kubenswrapper[4773]: I0121 16:50:53.919876 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.008496 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-42kkr" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="registry-server" containerID="cri-o://f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e" gracePeriod=2 Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.716109 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.860273 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities\") pod \"05b2b505-374c-438a-936e-cc9209f9359a\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.860353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content\") pod \"05b2b505-374c-438a-936e-cc9209f9359a\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.860491 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq5rt\" (UniqueName: \"kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt\") pod \"05b2b505-374c-438a-936e-cc9209f9359a\" (UID: \"05b2b505-374c-438a-936e-cc9209f9359a\") " Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.861117 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities" (OuterVolumeSpecName: "utilities") pod "05b2b505-374c-438a-936e-cc9209f9359a" (UID: "05b2b505-374c-438a-936e-cc9209f9359a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.861237 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.868627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt" (OuterVolumeSpecName: "kube-api-access-tq5rt") pod "05b2b505-374c-438a-936e-cc9209f9359a" (UID: "05b2b505-374c-438a-936e-cc9209f9359a"). InnerVolumeSpecName "kube-api-access-tq5rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.965320 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq5rt\" (UniqueName: \"kubernetes.io/projected/05b2b505-374c-438a-936e-cc9209f9359a-kube-api-access-tq5rt\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:55 crc kubenswrapper[4773]: I0121 16:50:55.995326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05b2b505-374c-438a-936e-cc9209f9359a" (UID: "05b2b505-374c-438a-936e-cc9209f9359a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.022146 4773 generic.go:334] "Generic (PLEG): container finished" podID="05b2b505-374c-438a-936e-cc9209f9359a" containerID="f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e" exitCode=0 Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.022187 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerDied","Data":"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e"} Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.022230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42kkr" event={"ID":"05b2b505-374c-438a-936e-cc9209f9359a","Type":"ContainerDied","Data":"41c549b7f0c26eb7632da0d9323af51754cb3ddbb6cef9beb120131cd9f262cc"} Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.022248 4773 scope.go:117] "RemoveContainer" containerID="f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.022411 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42kkr" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.061840 4773 scope.go:117] "RemoveContainer" containerID="089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.067246 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2b505-374c-438a-936e-cc9209f9359a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.088744 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.096057 4773 scope.go:117] "RemoveContainer" containerID="aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.101717 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-42kkr"] Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.140210 4773 scope.go:117] "RemoveContainer" containerID="f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e" Jan 21 16:50:56 crc kubenswrapper[4773]: E0121 16:50:56.140768 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e\": container with ID starting with f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e not found: ID does not exist" containerID="f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.140829 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e"} err="failed to get container status 
\"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e\": rpc error: code = NotFound desc = could not find container \"f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e\": container with ID starting with f0e6b51f06f0c42907fa3e129e894d59539454d1f7cf71e9a674bedd6b565d9e not found: ID does not exist" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.140857 4773 scope.go:117] "RemoveContainer" containerID="089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711" Jan 21 16:50:56 crc kubenswrapper[4773]: E0121 16:50:56.141197 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711\": container with ID starting with 089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711 not found: ID does not exist" containerID="089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.141239 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711"} err="failed to get container status \"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711\": rpc error: code = NotFound desc = could not find container \"089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711\": container with ID starting with 089f269bf8cf31fdbfa57c1104a40a98e5c095258aee9cdc04084a49d5ee9711 not found: ID does not exist" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.141254 4773 scope.go:117] "RemoveContainer" containerID="aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18" Jan 21 16:50:56 crc kubenswrapper[4773]: E0121 16:50:56.141534 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18\": container with ID starting with aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18 not found: ID does not exist" containerID="aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18" Jan 21 16:50:56 crc kubenswrapper[4773]: I0121 16:50:56.141565 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18"} err="failed to get container status \"aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18\": rpc error: code = NotFound desc = could not find container \"aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18\": container with ID starting with aad087dbd13b92df30760a7d6e7cce3196d0b0a289a516a890fd9bd3e5e63b18 not found: ID does not exist" Jan 21 16:50:57 crc kubenswrapper[4773]: I0121 16:50:57.395664 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b2b505-374c-438a-936e-cc9209f9359a" path="/var/lib/kubelet/pods/05b2b505-374c-438a-936e-cc9209f9359a/volumes" Jan 21 16:51:03 crc kubenswrapper[4773]: I0121 16:51:03.384176 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:51:03 crc kubenswrapper[4773]: E0121 16:51:03.385108 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.384298 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668" Jan 21 16:51:16 crc 
kubenswrapper[4773]: E0121 16:51:16.386281 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.547161 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"] Jan 21 16:51:16 crc kubenswrapper[4773]: E0121 16:51:16.549287 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="registry-server" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.549584 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="registry-server" Jan 21 16:51:16 crc kubenswrapper[4773]: E0121 16:51:16.549674 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="extract-utilities" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.549791 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="extract-utilities" Jan 21 16:51:16 crc kubenswrapper[4773]: E0121 16:51:16.549933 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="extract-content" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.550011 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="extract-content" Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.550739 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="05b2b505-374c-438a-936e-cc9209f9359a" containerName="registry-server"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.554305 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.580882 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"]
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.612987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.613074 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.613124 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlk6\" (UniqueName: \"kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.715517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.715622 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlk6\" (UniqueName: \"kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.715825 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.716369 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.716656 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.811042 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlk6\" (UniqueName: \"kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6\") pod \"redhat-marketplace-knm2n\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") " pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:16 crc kubenswrapper[4773]: I0121 16:51:16.885629 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:17 crc kubenswrapper[4773]: I0121 16:51:17.414386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"]
Jan 21 16:51:18 crc kubenswrapper[4773]: I0121 16:51:18.243237 4773 generic.go:334] "Generic (PLEG): container finished" podID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerID="dc856be20033ce99b2c8f5313abdfebe81d78d65248537fa3b3d41254412af68" exitCode=0
Jan 21 16:51:18 crc kubenswrapper[4773]: I0121 16:51:18.243451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerDied","Data":"dc856be20033ce99b2c8f5313abdfebe81d78d65248537fa3b3d41254412af68"}
Jan 21 16:51:18 crc kubenswrapper[4773]: I0121 16:51:18.243553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerStarted","Data":"fe7fdf9b49870013f3009b99ae25fd858ece13dcc86f1a769fbec501caace52d"}
Jan 21 16:51:19 crc kubenswrapper[4773]: I0121 16:51:19.254624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerStarted","Data":"74870e3a405f88cbb49c3408dd0f56f0342052f22e4102272ebbdcc77400a917"}
Jan 21 16:51:20 crc kubenswrapper[4773]: I0121 16:51:20.271272 4773 generic.go:334] "Generic (PLEG): container finished" podID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerID="74870e3a405f88cbb49c3408dd0f56f0342052f22e4102272ebbdcc77400a917" exitCode=0
Jan 21 16:51:20 crc kubenswrapper[4773]: I0121 16:51:20.271339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerDied","Data":"74870e3a405f88cbb49c3408dd0f56f0342052f22e4102272ebbdcc77400a917"}
Jan 21 16:51:21 crc kubenswrapper[4773]: I0121 16:51:21.283712 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerStarted","Data":"057399d05b99d29b42721462e3863c7d54673e5c3969673df15617cbc1d248c5"}
Jan 21 16:51:21 crc kubenswrapper[4773]: I0121 16:51:21.302514 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-knm2n" podStartSLOduration=2.887316096 podStartE2EDuration="5.302495788s" podCreationTimestamp="2026-01-21 16:51:16 +0000 UTC" firstStartedPulling="2026-01-21 16:51:18.245915186 +0000 UTC m=+5243.170404808" lastFinishedPulling="2026-01-21 16:51:20.661094878 +0000 UTC m=+5245.585584500" observedRunningTime="2026-01-21 16:51:21.300036492 +0000 UTC m=+5246.224526114" watchObservedRunningTime="2026-01-21 16:51:21.302495788 +0000 UTC m=+5246.226985410"
Jan 21 16:51:26 crc kubenswrapper[4773]: I0121 16:51:26.887367 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:26 crc kubenswrapper[4773]: I0121 16:51:26.888256 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:26 crc kubenswrapper[4773]: I0121 16:51:26.944092 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:27 crc kubenswrapper[4773]: I0121 16:51:27.748491 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:27 crc kubenswrapper[4773]: I0121 16:51:27.794454 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"]
Jan 21 16:51:29 crc kubenswrapper[4773]: I0121 16:51:29.357459 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-knm2n" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="registry-server" containerID="cri-o://057399d05b99d29b42721462e3863c7d54673e5c3969673df15617cbc1d248c5" gracePeriod=2
Jan 21 16:51:29 crc kubenswrapper[4773]: I0121 16:51:29.384477 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:51:29 crc kubenswrapper[4773]: E0121 16:51:29.384720 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:51:30 crc kubenswrapper[4773]: I0121 16:51:30.369138 4773 generic.go:334] "Generic (PLEG): container finished" podID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerID="057399d05b99d29b42721462e3863c7d54673e5c3969673df15617cbc1d248c5" exitCode=0
Jan 21 16:51:30 crc kubenswrapper[4773]: I0121 16:51:30.369197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerDied","Data":"057399d05b99d29b42721462e3863c7d54673e5c3969673df15617cbc1d248c5"}
Jan 21 16:51:30 crc kubenswrapper[4773]: I0121 16:51:30.882734 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.036934 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqlk6\" (UniqueName: \"kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6\") pod \"11fb9ac2-0628-4ea4-a70d-703801f97902\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") "
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.037069 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content\") pod \"11fb9ac2-0628-4ea4-a70d-703801f97902\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") "
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.037098 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities\") pod \"11fb9ac2-0628-4ea4-a70d-703801f97902\" (UID: \"11fb9ac2-0628-4ea4-a70d-703801f97902\") "
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.040567 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities" (OuterVolumeSpecName: "utilities") pod "11fb9ac2-0628-4ea4-a70d-703801f97902" (UID: "11fb9ac2-0628-4ea4-a70d-703801f97902"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.043551 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6" (OuterVolumeSpecName: "kube-api-access-gqlk6") pod "11fb9ac2-0628-4ea4-a70d-703801f97902" (UID: "11fb9ac2-0628-4ea4-a70d-703801f97902"). InnerVolumeSpecName "kube-api-access-gqlk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.098101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11fb9ac2-0628-4ea4-a70d-703801f97902" (UID: "11fb9ac2-0628-4ea4-a70d-703801f97902"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.142308 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.142368 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fb9ac2-0628-4ea4-a70d-703801f97902-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.142380 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqlk6\" (UniqueName: \"kubernetes.io/projected/11fb9ac2-0628-4ea4-a70d-703801f97902-kube-api-access-gqlk6\") on node \"crc\" DevicePath \"\""
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.392677 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-knm2n"
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.400263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-knm2n" event={"ID":"11fb9ac2-0628-4ea4-a70d-703801f97902","Type":"ContainerDied","Data":"fe7fdf9b49870013f3009b99ae25fd858ece13dcc86f1a769fbec501caace52d"}
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.400315 4773 scope.go:117] "RemoveContainer" containerID="057399d05b99d29b42721462e3863c7d54673e5c3969673df15617cbc1d248c5"
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.429685 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"]
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.433834 4773 scope.go:117] "RemoveContainer" containerID="74870e3a405f88cbb49c3408dd0f56f0342052f22e4102272ebbdcc77400a917"
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.440098 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-knm2n"]
Jan 21 16:51:31 crc kubenswrapper[4773]: I0121 16:51:31.453642 4773 scope.go:117] "RemoveContainer" containerID="dc856be20033ce99b2c8f5313abdfebe81d78d65248537fa3b3d41254412af68"
Jan 21 16:51:33 crc kubenswrapper[4773]: I0121 16:51:33.396965 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" path="/var/lib/kubelet/pods/11fb9ac2-0628-4ea4-a70d-703801f97902/volumes"
Jan 21 16:51:41 crc kubenswrapper[4773]: I0121 16:51:41.383507 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:51:41 crc kubenswrapper[4773]: E0121 16:51:41.384264 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:51:52 crc kubenswrapper[4773]: I0121 16:51:52.383871 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:51:52 crc kubenswrapper[4773]: E0121 16:51:52.384432 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:52:06 crc kubenswrapper[4773]: I0121 16:52:06.384098 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:52:06 crc kubenswrapper[4773]: E0121 16:52:06.384921 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:52:21 crc kubenswrapper[4773]: I0121 16:52:21.384052 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:52:21 crc kubenswrapper[4773]: E0121 16:52:21.386121 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:52:34 crc kubenswrapper[4773]: I0121 16:52:34.383825 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:52:34 crc kubenswrapper[4773]: E0121 16:52:34.384669 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:52:45 crc kubenswrapper[4773]: I0121 16:52:45.393379 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:52:45 crc kubenswrapper[4773]: E0121 16:52:45.394147 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 16:52:59 crc kubenswrapper[4773]: I0121 16:52:59.384176 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:53:00 crc kubenswrapper[4773]: I0121 16:53:00.276963 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db"}
Jan 21 16:55:25 crc kubenswrapper[4773]: I0121 16:55:25.206022 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:55:25 crc kubenswrapper[4773]: I0121 16:55:25.206490 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:55:55 crc kubenswrapper[4773]: I0121 16:55:55.206150 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:55:55 crc kubenswrapper[4773]: I0121 16:55:55.206787 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:56:25 crc kubenswrapper[4773]: I0121 16:56:25.206306 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 16:56:25 crc kubenswrapper[4773]: I0121 16:56:25.206926 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 16:56:25 crc kubenswrapper[4773]: I0121 16:56:25.206968 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc"
Jan 21 16:56:25 crc kubenswrapper[4773]: I0121 16:56:25.207817 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 16:56:25 crc kubenswrapper[4773]: I0121 16:56:25.207880 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db" gracePeriod=600
Jan 21 16:56:26 crc kubenswrapper[4773]: I0121 16:56:26.267263 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db" exitCode=0
Jan 21 16:56:26 crc kubenswrapper[4773]: I0121 16:56:26.267718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db"}
Jan 21 16:56:26 crc kubenswrapper[4773]: I0121 16:56:26.267755 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"}
Jan 21 16:56:26 crc kubenswrapper[4773]: I0121 16:56:26.267772 4773 scope.go:117] "RemoveContainer" containerID="5563c661e69d7c61b447a15b4aaa552bf89a2c432bf67537788bab4d9f849668"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.198624 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:38 crc kubenswrapper[4773]: E0121 16:57:38.203284 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="extract-utilities"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.203322 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="extract-utilities"
Jan 21 16:57:38 crc kubenswrapper[4773]: E0121 16:57:38.203370 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="registry-server"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.203381 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="registry-server"
Jan 21 16:57:38 crc kubenswrapper[4773]: E0121 16:57:38.203402 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="extract-content"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.203410 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="extract-content"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.203723 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fb9ac2-0628-4ea4-a70d-703801f97902" containerName="registry-server"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.207798 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.225278 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.367152 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zghl\" (UniqueName: \"kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.367363 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.367399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.469975 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.470036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.470166 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zghl\" (UniqueName: \"kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.470644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.470661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.493885 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zghl\" (UniqueName: \"kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl\") pod \"community-operators-hphv5\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") " pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:38 crc kubenswrapper[4773]: I0121 16:57:38.534785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:39 crc kubenswrapper[4773]: I0121 16:57:39.173128 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:40 crc kubenswrapper[4773]: I0121 16:57:40.001831 4773 generic.go:334] "Generic (PLEG): container finished" podID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerID="e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac" exitCode=0
Jan 21 16:57:40 crc kubenswrapper[4773]: I0121 16:57:40.001883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerDied","Data":"e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac"}
Jan 21 16:57:40 crc kubenswrapper[4773]: I0121 16:57:40.002124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerStarted","Data":"a69a79486d9085cfc865eef7fda3cf0b1193d25fdb6ff29ebe2553207949f8a5"}
Jan 21 16:57:40 crc kubenswrapper[4773]: I0121 16:57:40.003725 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 16:57:41 crc kubenswrapper[4773]: I0121 16:57:41.013745 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerStarted","Data":"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89"}
Jan 21 16:57:42 crc kubenswrapper[4773]: I0121 16:57:42.024901 4773 generic.go:334] "Generic (PLEG): container finished" podID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerID="974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89" exitCode=0
Jan 21 16:57:42 crc kubenswrapper[4773]: I0121 16:57:42.024955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerDied","Data":"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89"}
Jan 21 16:57:43 crc kubenswrapper[4773]: I0121 16:57:43.036811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerStarted","Data":"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"}
Jan 21 16:57:43 crc kubenswrapper[4773]: I0121 16:57:43.062244 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hphv5" podStartSLOduration=2.391323826 podStartE2EDuration="5.062204371s" podCreationTimestamp="2026-01-21 16:57:38 +0000 UTC" firstStartedPulling="2026-01-21 16:57:40.003458881 +0000 UTC m=+5624.927948503" lastFinishedPulling="2026-01-21 16:57:42.674339426 +0000 UTC m=+5627.598829048" observedRunningTime="2026-01-21 16:57:43.050895398 +0000 UTC m=+5627.975385020" watchObservedRunningTime="2026-01-21 16:57:43.062204371 +0000 UTC m=+5627.986694033"
Jan 21 16:57:48 crc kubenswrapper[4773]: I0121 16:57:48.535487 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:48 crc kubenswrapper[4773]: I0121 16:57:48.537074 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:48 crc kubenswrapper[4773]: I0121 16:57:48.584705 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:49 crc kubenswrapper[4773]: I0121 16:57:49.139775 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:49 crc kubenswrapper[4773]: I0121 16:57:49.197639 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.105038 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hphv5" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="registry-server" containerID="cri-o://e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74" gracePeriod=2
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.854188 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.971739 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zghl\" (UniqueName: \"kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl\") pod \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") "
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.971829 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content\") pod \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") "
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.972094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities\") pod \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\" (UID: \"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666\") "
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.973034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities" (OuterVolumeSpecName: "utilities") pod "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" (UID: "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:57:51 crc kubenswrapper[4773]: I0121 16:57:51.977948 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl" (OuterVolumeSpecName: "kube-api-access-7zghl") pod "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" (UID: "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666"). InnerVolumeSpecName "kube-api-access-7zghl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.028037 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" (UID: "a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.074061 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.074104 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zghl\" (UniqueName: \"kubernetes.io/projected/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-kube-api-access-7zghl\") on node \"crc\" DevicePath \"\""
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.074115 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.116179 4773 generic.go:334] "Generic (PLEG): container finished" podID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerID="e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74" exitCode=0
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.116220 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerDied","Data":"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"}
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.116244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hphv5" event={"ID":"a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666","Type":"ContainerDied","Data":"a69a79486d9085cfc865eef7fda3cf0b1193d25fdb6ff29ebe2553207949f8a5"}
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.116260 4773 scope.go:117] "RemoveContainer" containerID="e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.116372 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hphv5"
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.150959 4773 scope.go:117] "RemoveContainer" containerID="974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89"
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.156354 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.170040 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hphv5"]
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.177335 4773 scope.go:117] "RemoveContainer" containerID="e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac"
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.232830 4773 scope.go:117] "RemoveContainer" containerID="e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"
Jan 21 16:57:52 crc kubenswrapper[4773]: E0121 16:57:52.236092 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74\": container with ID starting with e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74 not found: ID does not exist" containerID="e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"
Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.236124 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74"} err="failed to get container status \"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74\": rpc error: code = NotFound desc = could not find container \"e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74\": container with ID starting with
e0ef376e47d34e7df1b43f5441301e119f84cf5cfbe8795ca7751114714ddf74 not found: ID does not exist" Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.236145 4773 scope.go:117] "RemoveContainer" containerID="974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89" Jan 21 16:57:52 crc kubenswrapper[4773]: E0121 16:57:52.236427 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89\": container with ID starting with 974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89 not found: ID does not exist" containerID="974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89" Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.236530 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89"} err="failed to get container status \"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89\": rpc error: code = NotFound desc = could not find container \"974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89\": container with ID starting with 974f5dad2f49e9903f437565dfb1987ecbe63a40d539c21f7683a048d13abe89 not found: ID does not exist" Jan 21 16:57:52 crc kubenswrapper[4773]: I0121 16:57:52.236564 4773 scope.go:117] "RemoveContainer" containerID="e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac" Jan 21 16:57:52 crc kubenswrapper[4773]: E0121 16:57:52.236854 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac\": container with ID starting with e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac not found: ID does not exist" containerID="e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac" Jan 21 16:57:52 crc 
kubenswrapper[4773]: I0121 16:57:52.236881 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac"} err="failed to get container status \"e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac\": rpc error: code = NotFound desc = could not find container \"e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac\": container with ID starting with e21015601b2c00fe4308e8c1da1786e403a51e0c3fcb88b6c93d6b6f9a400bac not found: ID does not exist" Jan 21 16:57:53 crc kubenswrapper[4773]: I0121 16:57:53.399073 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" path="/var/lib/kubelet/pods/a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666/volumes" Jan 21 16:58:25 crc kubenswrapper[4773]: I0121 16:58:25.205878 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:58:25 crc kubenswrapper[4773]: I0121 16:58:25.206433 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:58:55 crc kubenswrapper[4773]: I0121 16:58:55.205911 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:58:55 crc kubenswrapper[4773]: I0121 16:58:55.206433 4773 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:59:25 crc kubenswrapper[4773]: I0121 16:59:25.206139 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 16:59:25 crc kubenswrapper[4773]: I0121 16:59:25.206648 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 16:59:25 crc kubenswrapper[4773]: I0121 16:59:25.206686 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 16:59:25 crc kubenswrapper[4773]: I0121 16:59:25.207427 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 16:59:25 crc kubenswrapper[4773]: I0121 16:59:25.207469 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" 
containerName="machine-config-daemon" containerID="cri-o://12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" gracePeriod=600 Jan 21 16:59:25 crc kubenswrapper[4773]: E0121 16:59:25.331308 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:59:26 crc kubenswrapper[4773]: I0121 16:59:26.175073 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" exitCode=0 Jan 21 16:59:26 crc kubenswrapper[4773]: I0121 16:59:26.175436 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"} Jan 21 16:59:26 crc kubenswrapper[4773]: I0121 16:59:26.175567 4773 scope.go:117] "RemoveContainer" containerID="095d92d7a9145764c608090659de50579bfd67c54fb973398e075cbb277923db" Jan 21 16:59:26 crc kubenswrapper[4773]: I0121 16:59:26.176546 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 16:59:26 crc kubenswrapper[4773]: E0121 16:59:26.176986 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:59:41 crc kubenswrapper[4773]: I0121 16:59:41.383732 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 16:59:41 crc kubenswrapper[4773]: E0121 16:59:41.384409 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 16:59:54 crc kubenswrapper[4773]: I0121 16:59:54.384139 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 16:59:54 crc kubenswrapper[4773]: E0121 16:59:54.384894 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.180024 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj"] Jan 21 17:00:00 crc kubenswrapper[4773]: E0121 17:00:00.181028 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="extract-utilities" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.181046 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="extract-utilities" Jan 21 17:00:00 crc kubenswrapper[4773]: E0121 17:00:00.181059 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.181067 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4773]: E0121 17:00:00.181111 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="extract-content" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.181119 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="extract-content" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.181358 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f7c7f2-6262-4a8d-b6de-dc5a30cb3666" containerName="registry-server" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.182288 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.187336 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.187681 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.190818 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj"] Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.333152 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.333206 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbwg\" (UniqueName: \"kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.333437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.435371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.435529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.435547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbwg\" (UniqueName: \"kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.436577 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.445379 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.453215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbwg\" (UniqueName: \"kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg\") pod \"collect-profiles-29483580-269lj\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:00 crc kubenswrapper[4773]: I0121 17:00:00.507285 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:01 crc kubenswrapper[4773]: I0121 17:00:01.026544 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj"] Jan 21 17:00:01 crc kubenswrapper[4773]: I0121 17:00:01.486090 4773 generic.go:334] "Generic (PLEG): container finished" podID="97d17e7d-d3be-4cfc-b4a1-520989cdc10e" containerID="50a728cc2638a11b7aac301d1c3b69f9b6e7e79800d5df6638f7d6a9eb4afe5d" exitCode=0 Jan 21 17:00:01 crc kubenswrapper[4773]: I0121 17:00:01.486142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" event={"ID":"97d17e7d-d3be-4cfc-b4a1-520989cdc10e","Type":"ContainerDied","Data":"50a728cc2638a11b7aac301d1c3b69f9b6e7e79800d5df6638f7d6a9eb4afe5d"} Jan 21 17:00:01 crc kubenswrapper[4773]: I0121 17:00:01.486478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" 
event={"ID":"97d17e7d-d3be-4cfc-b4a1-520989cdc10e","Type":"ContainerStarted","Data":"7772bedf5344c10dc8fe868cb73900531e5a60c7b662ba8a35b5a7408c3a91b7"} Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.074500 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.197583 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbwg\" (UniqueName: \"kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg\") pod \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.198141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume\") pod \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.198211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume\") pod \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\" (UID: \"97d17e7d-d3be-4cfc-b4a1-520989cdc10e\") " Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.198772 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume" (OuterVolumeSpecName: "config-volume") pod "97d17e7d-d3be-4cfc-b4a1-520989cdc10e" (UID: "97d17e7d-d3be-4cfc-b4a1-520989cdc10e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.199038 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.207263 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97d17e7d-d3be-4cfc-b4a1-520989cdc10e" (UID: "97d17e7d-d3be-4cfc-b4a1-520989cdc10e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.207288 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg" (OuterVolumeSpecName: "kube-api-access-pdbwg") pod "97d17e7d-d3be-4cfc-b4a1-520989cdc10e" (UID: "97d17e7d-d3be-4cfc-b4a1-520989cdc10e"). InnerVolumeSpecName "kube-api-access-pdbwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.301003 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.301048 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbwg\" (UniqueName: \"kubernetes.io/projected/97d17e7d-d3be-4cfc-b4a1-520989cdc10e-kube-api-access-pdbwg\") on node \"crc\" DevicePath \"\"" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.504548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" event={"ID":"97d17e7d-d3be-4cfc-b4a1-520989cdc10e","Type":"ContainerDied","Data":"7772bedf5344c10dc8fe868cb73900531e5a60c7b662ba8a35b5a7408c3a91b7"} Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.504589 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7772bedf5344c10dc8fe868cb73900531e5a60c7b662ba8a35b5a7408c3a91b7" Jan 21 17:00:03 crc kubenswrapper[4773]: I0121 17:00:03.504638 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483580-269lj" Jan 21 17:00:04 crc kubenswrapper[4773]: I0121 17:00:04.152721 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6"] Jan 21 17:00:04 crc kubenswrapper[4773]: I0121 17:00:04.162178 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483535-znvl6"] Jan 21 17:00:05 crc kubenswrapper[4773]: I0121 17:00:05.396201 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd839d1-7464-4060-98c8-b22f706471ff" path="/var/lib/kubelet/pods/ffd839d1-7464-4060-98c8-b22f706471ff/volumes" Jan 21 17:00:09 crc kubenswrapper[4773]: I0121 17:00:09.383644 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:00:09 crc kubenswrapper[4773]: E0121 17:00:09.384235 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:00:20 crc kubenswrapper[4773]: I0121 17:00:20.384104 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:00:20 crc kubenswrapper[4773]: E0121 17:00:20.384922 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:00:34 crc kubenswrapper[4773]: I0121 17:00:34.383612 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:00:34 crc kubenswrapper[4773]: E0121 17:00:34.384296 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:00:36 crc kubenswrapper[4773]: I0121 17:00:36.485753 4773 scope.go:117] "RemoveContainer" containerID="60df97948b8ad659b2694b77b9238efff19c2705a954b78a911970aeb9b459e6" Jan 21 17:00:45 crc kubenswrapper[4773]: I0121 17:00:45.394209 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:00:45 crc kubenswrapper[4773]: E0121 17:00:45.395641 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:00:59 crc kubenswrapper[4773]: I0121 17:00:59.383583 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:00:59 crc kubenswrapper[4773]: E0121 17:00:59.385573 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.153499 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29483581-4nldh"] Jan 21 17:01:00 crc kubenswrapper[4773]: E0121 17:01:00.154253 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d17e7d-d3be-4cfc-b4a1-520989cdc10e" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.154329 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d17e7d-d3be-4cfc-b4a1-520989cdc10e" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.154644 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d17e7d-d3be-4cfc-b4a1-520989cdc10e" containerName="collect-profiles" Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.155455 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.165556 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483581-4nldh"]
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.237136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.237420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.237463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.237559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvlzk\" (UniqueName: \"kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.339355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvlzk\" (UniqueName: \"kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.339456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.339528 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.339577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.346444 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.349493 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.355325 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.357648 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvlzk\" (UniqueName: \"kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk\") pod \"keystone-cron-29483581-4nldh\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") " pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.485797 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:00 crc kubenswrapper[4773]: I0121 17:01:00.952426 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483581-4nldh"]
Jan 21 17:01:01 crc kubenswrapper[4773]: I0121 17:01:01.135556 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483581-4nldh" event={"ID":"614a754c-6d1d-4b06-8777-dfbd36e860b4","Type":"ContainerStarted","Data":"101fddce6227218095340139c454a8d7e96e8afbb7433f676c137e9c0dd1b89e"}
Jan 21 17:01:02 crc kubenswrapper[4773]: I0121 17:01:02.147033 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483581-4nldh" event={"ID":"614a754c-6d1d-4b06-8777-dfbd36e860b4","Type":"ContainerStarted","Data":"fc20b3ca0a6901967d670e17b2e23246e8d1ab37f60b8f9cd6f69c4bfb1f439a"}
Jan 21 17:01:02 crc kubenswrapper[4773]: I0121 17:01:02.177048 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29483581-4nldh" podStartSLOduration=2.177026516 podStartE2EDuration="2.177026516s" podCreationTimestamp="2026-01-21 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 17:01:02.163618467 +0000 UTC m=+5827.088108099" watchObservedRunningTime="2026-01-21 17:01:02.177026516 +0000 UTC m=+5827.101516138"
Jan 21 17:01:08 crc kubenswrapper[4773]: I0121 17:01:08.199262 4773 generic.go:334] "Generic (PLEG): container finished" podID="614a754c-6d1d-4b06-8777-dfbd36e860b4" containerID="fc20b3ca0a6901967d670e17b2e23246e8d1ab37f60b8f9cd6f69c4bfb1f439a" exitCode=0
Jan 21 17:01:08 crc kubenswrapper[4773]: I0121 17:01:08.199357 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483581-4nldh" event={"ID":"614a754c-6d1d-4b06-8777-dfbd36e860b4","Type":"ContainerDied","Data":"fc20b3ca0a6901967d670e17b2e23246e8d1ab37f60b8f9cd6f69c4bfb1f439a"}
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.783276 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.842091 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys\") pod \"614a754c-6d1d-4b06-8777-dfbd36e860b4\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") "
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.842552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle\") pod \"614a754c-6d1d-4b06-8777-dfbd36e860b4\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") "
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.842670 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvlzk\" (UniqueName: \"kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk\") pod \"614a754c-6d1d-4b06-8777-dfbd36e860b4\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") "
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.842731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data\") pod \"614a754c-6d1d-4b06-8777-dfbd36e860b4\" (UID: \"614a754c-6d1d-4b06-8777-dfbd36e860b4\") "
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.850156 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk" (OuterVolumeSpecName: "kube-api-access-dvlzk") pod "614a754c-6d1d-4b06-8777-dfbd36e860b4" (UID: "614a754c-6d1d-4b06-8777-dfbd36e860b4"). InnerVolumeSpecName "kube-api-access-dvlzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.851799 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "614a754c-6d1d-4b06-8777-dfbd36e860b4" (UID: "614a754c-6d1d-4b06-8777-dfbd36e860b4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.873488 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "614a754c-6d1d-4b06-8777-dfbd36e860b4" (UID: "614a754c-6d1d-4b06-8777-dfbd36e860b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.907933 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data" (OuterVolumeSpecName: "config-data") pod "614a754c-6d1d-4b06-8777-dfbd36e860b4" (UID: "614a754c-6d1d-4b06-8777-dfbd36e860b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.945050 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.945093 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.945112 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvlzk\" (UniqueName: \"kubernetes.io/projected/614a754c-6d1d-4b06-8777-dfbd36e860b4-kube-api-access-dvlzk\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:09 crc kubenswrapper[4773]: I0121 17:01:09.945124 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614a754c-6d1d-4b06-8777-dfbd36e860b4-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:10 crc kubenswrapper[4773]: I0121 17:01:10.219981 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483581-4nldh" event={"ID":"614a754c-6d1d-4b06-8777-dfbd36e860b4","Type":"ContainerDied","Data":"101fddce6227218095340139c454a8d7e96e8afbb7433f676c137e9c0dd1b89e"}
Jan 21 17:01:10 crc kubenswrapper[4773]: I0121 17:01:10.220026 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101fddce6227218095340139c454a8d7e96e8afbb7433f676c137e9c0dd1b89e"
Jan 21 17:01:10 crc kubenswrapper[4773]: I0121 17:01:10.220081 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483581-4nldh"
Jan 21 17:01:14 crc kubenswrapper[4773]: I0121 17:01:14.384146 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:01:14 crc kubenswrapper[4773]: E0121 17:01:14.384968 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:01:27 crc kubenswrapper[4773]: I0121 17:01:27.387347 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:01:27 crc kubenswrapper[4773]: E0121 17:01:27.388106 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.279933 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:28 crc kubenswrapper[4773]: E0121 17:01:28.280705 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614a754c-6d1d-4b06-8777-dfbd36e860b4" containerName="keystone-cron"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.280725 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="614a754c-6d1d-4b06-8777-dfbd36e860b4" containerName="keystone-cron"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.281000 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="614a754c-6d1d-4b06-8777-dfbd36e860b4" containerName="keystone-cron"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.282857 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.290714 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.412658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.412837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.412870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhkl\" (UniqueName: \"kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.514555 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.514600 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhkl\" (UniqueName: \"kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.514751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.515108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.515256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.534542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhkl\" (UniqueName: \"kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl\") pod \"redhat-marketplace-hk277\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") " pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:28 crc kubenswrapper[4773]: I0121 17:01:28.613068 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:29 crc kubenswrapper[4773]: I0121 17:01:29.122116 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:29 crc kubenswrapper[4773]: I0121 17:01:29.394162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerStarted","Data":"f9ae6c90bc1aca30734a75def3fa83b9d2f083e4fa40851848734d87c0fa4854"}
Jan 21 17:01:30 crc kubenswrapper[4773]: I0121 17:01:30.400061 4773 generic.go:334] "Generic (PLEG): container finished" podID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerID="1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816" exitCode=0
Jan 21 17:01:30 crc kubenswrapper[4773]: I0121 17:01:30.400122 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerDied","Data":"1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816"}
Jan 21 17:01:32 crc kubenswrapper[4773]: I0121 17:01:32.475213 4773 generic.go:334] "Generic (PLEG): container finished" podID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerID="6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be" exitCode=0
Jan 21 17:01:32 crc kubenswrapper[4773]: I0121 17:01:32.475334 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerDied","Data":"6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be"}
Jan 21 17:01:34 crc kubenswrapper[4773]: I0121 17:01:34.495042 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerStarted","Data":"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"}
Jan 21 17:01:34 crc kubenswrapper[4773]: I0121 17:01:34.526557 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hk277" podStartSLOduration=3.896907791 podStartE2EDuration="6.52653229s" podCreationTimestamp="2026-01-21 17:01:28 +0000 UTC" firstStartedPulling="2026-01-21 17:01:30.402477007 +0000 UTC m=+5855.326966629" lastFinishedPulling="2026-01-21 17:01:33.032101496 +0000 UTC m=+5857.956591128" observedRunningTime="2026-01-21 17:01:34.515278838 +0000 UTC m=+5859.439768470" watchObservedRunningTime="2026-01-21 17:01:34.52653229 +0000 UTC m=+5859.451021912"
Jan 21 17:01:38 crc kubenswrapper[4773]: I0121 17:01:38.613541 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:38 crc kubenswrapper[4773]: I0121 17:01:38.615325 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:38 crc kubenswrapper[4773]: I0121 17:01:38.670839 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:39 crc kubenswrapper[4773]: I0121 17:01:39.601101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:39 crc kubenswrapper[4773]: I0121 17:01:39.652743 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:40 crc kubenswrapper[4773]: I0121 17:01:40.384894 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:01:40 crc kubenswrapper[4773]: E0121 17:01:40.385180 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:01:41 crc kubenswrapper[4773]: I0121 17:01:41.568611 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hk277" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="registry-server" containerID="cri-o://b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901" gracePeriod=2
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.302211 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.417037 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities\") pod \"deea5351-f20f-4c85-89e6-5bf7f1f63847\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") "
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.417192 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nhkl\" (UniqueName: \"kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl\") pod \"deea5351-f20f-4c85-89e6-5bf7f1f63847\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") "
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.417351 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content\") pod \"deea5351-f20f-4c85-89e6-5bf7f1f63847\" (UID: \"deea5351-f20f-4c85-89e6-5bf7f1f63847\") "
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.419862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities" (OuterVolumeSpecName: "utilities") pod "deea5351-f20f-4c85-89e6-5bf7f1f63847" (UID: "deea5351-f20f-4c85-89e6-5bf7f1f63847"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.429109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl" (OuterVolumeSpecName: "kube-api-access-8nhkl") pod "deea5351-f20f-4c85-89e6-5bf7f1f63847" (UID: "deea5351-f20f-4c85-89e6-5bf7f1f63847"). InnerVolumeSpecName "kube-api-access-8nhkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.439804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deea5351-f20f-4c85-89e6-5bf7f1f63847" (UID: "deea5351-f20f-4c85-89e6-5bf7f1f63847"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.520460 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.520800 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nhkl\" (UniqueName: \"kubernetes.io/projected/deea5351-f20f-4c85-89e6-5bf7f1f63847-kube-api-access-8nhkl\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.520817 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deea5351-f20f-4c85-89e6-5bf7f1f63847-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.591941 4773 generic.go:334] "Generic (PLEG): container finished" podID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerID="b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901" exitCode=0
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.591984 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerDied","Data":"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"}
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.592015 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk277" event={"ID":"deea5351-f20f-4c85-89e6-5bf7f1f63847","Type":"ContainerDied","Data":"f9ae6c90bc1aca30734a75def3fa83b9d2f083e4fa40851848734d87c0fa4854"}
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.592021 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk277"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.592046 4773 scope.go:117] "RemoveContainer" containerID="b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.636036 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.641145 4773 scope.go:117] "RemoveContainer" containerID="6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.659650 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk277"]
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.673917 4773 scope.go:117] "RemoveContainer" containerID="1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.724888 4773 scope.go:117] "RemoveContainer" containerID="b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"
Jan 21 17:01:42 crc kubenswrapper[4773]: E0121 17:01:42.725293 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901\": container with ID starting with b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901 not found: ID does not exist" containerID="b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.725324 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901"} err="failed to get container status \"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901\": rpc error: code = NotFound desc = could not find container \"b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901\": container with ID starting with b7105a134200434ef3a21a9b64e35c8cec1d5756771d20d863b83d4cbc891901 not found: ID does not exist"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.725344 4773 scope.go:117] "RemoveContainer" containerID="6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be"
Jan 21 17:01:42 crc kubenswrapper[4773]: E0121 17:01:42.725704 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be\": container with ID starting with 6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be not found: ID does not exist" containerID="6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.725730 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be"} err="failed to get container status \"6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be\": rpc error: code = NotFound desc = could not find container \"6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be\": container with ID starting with 6a5d48d667f961d4bcd15f551fee4dfb44c90050e4a8fd1c994faa0e8f5c30be not found: ID does not exist"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.725746 4773 scope.go:117] "RemoveContainer" containerID="1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816"
Jan 21 17:01:42 crc kubenswrapper[4773]: E0121 17:01:42.726021 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816\": container with ID starting with 1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816 not found: ID does not exist" containerID="1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816"
Jan 21 17:01:42 crc kubenswrapper[4773]: I0121 17:01:42.726077 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816"} err="failed to get container status \"1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816\": rpc error: code = NotFound desc = could not find container \"1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816\": container with ID starting with 1bd2bfa32fbacdb72f811750df7703bca390d71e640ce17f3263317f58a08816 not found: ID does not exist"
Jan 21 17:01:43 crc kubenswrapper[4773]: I0121 17:01:43.397238 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" path="/var/lib/kubelet/pods/deea5351-f20f-4c85-89e6-5bf7f1f63847/volumes"
Jan 21 17:01:54 crc kubenswrapper[4773]: I0121 17:01:54.383860 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:01:54 crc kubenswrapper[4773]: E0121 17:01:54.384619 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:02:05 crc kubenswrapper[4773]: I0121 17:02:05.391118 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:02:05 crc kubenswrapper[4773]: E0121 17:02:05.391969 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:02:16 crc kubenswrapper[4773]: I0121 17:02:16.383814 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:02:16 crc kubenswrapper[4773]: E0121 17:02:16.384801 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:02:28 crc kubenswrapper[4773]: I0121 17:02:28.385565 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:02:28 crc kubenswrapper[4773]: E0121 17:02:28.386246 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:02:39 crc kubenswrapper[4773]: I0121 17:02:39.386894 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:02:39 crc kubenswrapper[4773]: E0121 17:02:39.387589 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:02:52 crc kubenswrapper[4773]: I0121 17:02:52.384629 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:02:52 crc kubenswrapper[4773]: E0121 17:02:52.385472 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:03:03 crc kubenswrapper[4773]: I0121 17:03:03.383939 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:03:03 crc kubenswrapper[4773]: E0121 17:03:03.384856 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:03:16 crc kubenswrapper[4773]: I0121 17:03:16.383572 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:03:16 crc kubenswrapper[4773]: E0121 17:03:16.384475 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:03:29 crc kubenswrapper[4773]: I0121 17:03:29.384349 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:03:29 crc kubenswrapper[4773]: E0121 17:03:29.385233 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:03:40 crc kubenswrapper[4773]: I0121 17:03:40.384041 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105"
Jan 21 17:03:40 crc kubenswrapper[4773]: E0121 17:03:40.384976 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan
21 17:03:53 crc kubenswrapper[4773]: I0121 17:03:53.387312 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:03:53 crc kubenswrapper[4773]: E0121 17:03:53.388505 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:04:06 crc kubenswrapper[4773]: I0121 17:04:06.384494 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:04:06 crc kubenswrapper[4773]: E0121 17:04:06.385161 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:04:20 crc kubenswrapper[4773]: I0121 17:04:20.384360 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:04:20 crc kubenswrapper[4773]: E0121 17:04:20.385205 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" 
podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" Jan 21 17:04:31 crc kubenswrapper[4773]: I0121 17:04:31.383520 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:04:32 crc kubenswrapper[4773]: I0121 17:04:32.314666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f"} Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.048060 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:04:52 crc kubenswrapper[4773]: E0121 17:04:52.049152 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="extract-content" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.049173 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="extract-content" Jan 21 17:04:52 crc kubenswrapper[4773]: E0121 17:04:52.049193 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="extract-utilities" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.049201 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="extract-utilities" Jan 21 17:04:52 crc kubenswrapper[4773]: E0121 17:04:52.049237 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="registry-server" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.049245 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="registry-server" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.049504 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="deea5351-f20f-4c85-89e6-5bf7f1f63847" containerName="registry-server" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.051793 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.065727 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.115658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.115889 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.115934 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqnh\" (UniqueName: \"kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.217718 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities\") pod 
\"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.217858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.217882 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqnh\" (UniqueName: \"kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.218321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.218360 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content\") pod \"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.237662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqnh\" (UniqueName: \"kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh\") pod 
\"certified-operators-8f9kw\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:52 crc kubenswrapper[4773]: I0121 17:04:52.373882 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:04:53 crc kubenswrapper[4773]: I0121 17:04:53.049566 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:04:53 crc kubenswrapper[4773]: I0121 17:04:53.520010 4773 generic.go:334] "Generic (PLEG): container finished" podID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerID="9cba79d9e1aeb74b1f1d65a14966cefb2ebf805d89216583f700f7bf5c224228" exitCode=0 Jan 21 17:04:53 crc kubenswrapper[4773]: I0121 17:04:53.520071 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerDied","Data":"9cba79d9e1aeb74b1f1d65a14966cefb2ebf805d89216583f700f7bf5c224228"} Jan 21 17:04:53 crc kubenswrapper[4773]: I0121 17:04:53.520333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerStarted","Data":"7c3d3d46c0a05600c78ffe99baa9188d124bcd80ec654cf8aec496e546e41cb4"} Jan 21 17:04:53 crc kubenswrapper[4773]: I0121 17:04:53.522253 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 17:04:54 crc kubenswrapper[4773]: I0121 17:04:54.530913 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerStarted","Data":"bb8dfba8b71b0afb3eb13b1800d760146104a8b9df184f1f2effe836716d73f9"} Jan 21 17:04:58 crc kubenswrapper[4773]: I0121 17:04:58.567966 4773 generic.go:334] "Generic (PLEG): container 
finished" podID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerID="bb8dfba8b71b0afb3eb13b1800d760146104a8b9df184f1f2effe836716d73f9" exitCode=0 Jan 21 17:04:58 crc kubenswrapper[4773]: I0121 17:04:58.568033 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerDied","Data":"bb8dfba8b71b0afb3eb13b1800d760146104a8b9df184f1f2effe836716d73f9"} Jan 21 17:04:59 crc kubenswrapper[4773]: I0121 17:04:59.588502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerStarted","Data":"30a811fb1970bf887d3b016865f7e73d709336b3f312d105ae0279d2b067f4d1"} Jan 21 17:04:59 crc kubenswrapper[4773]: I0121 17:04:59.612937 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8f9kw" podStartSLOduration=2.140259313 podStartE2EDuration="7.612918926s" podCreationTimestamp="2026-01-21 17:04:52 +0000 UTC" firstStartedPulling="2026-01-21 17:04:53.5220163 +0000 UTC m=+6058.446505922" lastFinishedPulling="2026-01-21 17:04:58.994675903 +0000 UTC m=+6063.919165535" observedRunningTime="2026-01-21 17:04:59.606038643 +0000 UTC m=+6064.530528265" watchObservedRunningTime="2026-01-21 17:04:59.612918926 +0000 UTC m=+6064.537408548" Jan 21 17:05:02 crc kubenswrapper[4773]: I0121 17:05:02.374172 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:05:02 crc kubenswrapper[4773]: I0121 17:05:02.375335 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:05:02 crc kubenswrapper[4773]: I0121 17:05:02.441276 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 
17:05:12 crc kubenswrapper[4773]: I0121 17:05:12.435752 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:05:12 crc kubenswrapper[4773]: I0121 17:05:12.490234 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:05:12 crc kubenswrapper[4773]: I0121 17:05:12.726241 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8f9kw" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="registry-server" containerID="cri-o://30a811fb1970bf887d3b016865f7e73d709336b3f312d105ae0279d2b067f4d1" gracePeriod=2 Jan 21 17:05:13 crc kubenswrapper[4773]: I0121 17:05:13.741278 4773 generic.go:334] "Generic (PLEG): container finished" podID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerID="30a811fb1970bf887d3b016865f7e73d709336b3f312d105ae0279d2b067f4d1" exitCode=0 Jan 21 17:05:13 crc kubenswrapper[4773]: I0121 17:05:13.741551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerDied","Data":"30a811fb1970bf887d3b016865f7e73d709336b3f312d105ae0279d2b067f4d1"} Jan 21 17:05:13 crc kubenswrapper[4773]: I0121 17:05:13.925303 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.007974 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content\") pod \"b7ca1119-312b-43c8-91b6-798ef6b132fd\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.008096 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities\") pod \"b7ca1119-312b-43c8-91b6-798ef6b132fd\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.008125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkqnh\" (UniqueName: \"kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh\") pod \"b7ca1119-312b-43c8-91b6-798ef6b132fd\" (UID: \"b7ca1119-312b-43c8-91b6-798ef6b132fd\") " Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.008974 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities" (OuterVolumeSpecName: "utilities") pod "b7ca1119-312b-43c8-91b6-798ef6b132fd" (UID: "b7ca1119-312b-43c8-91b6-798ef6b132fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.015841 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh" (OuterVolumeSpecName: "kube-api-access-jkqnh") pod "b7ca1119-312b-43c8-91b6-798ef6b132fd" (UID: "b7ca1119-312b-43c8-91b6-798ef6b132fd"). InnerVolumeSpecName "kube-api-access-jkqnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.082242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7ca1119-312b-43c8-91b6-798ef6b132fd" (UID: "b7ca1119-312b-43c8-91b6-798ef6b132fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.111549 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.111584 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkqnh\" (UniqueName: \"kubernetes.io/projected/b7ca1119-312b-43c8-91b6-798ef6b132fd-kube-api-access-jkqnh\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.111596 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ca1119-312b-43c8-91b6-798ef6b132fd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.761665 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8f9kw" event={"ID":"b7ca1119-312b-43c8-91b6-798ef6b132fd","Type":"ContainerDied","Data":"7c3d3d46c0a05600c78ffe99baa9188d124bcd80ec654cf8aec496e546e41cb4"} Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.761734 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8f9kw" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.761749 4773 scope.go:117] "RemoveContainer" containerID="30a811fb1970bf887d3b016865f7e73d709336b3f312d105ae0279d2b067f4d1" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.792382 4773 scope.go:117] "RemoveContainer" containerID="bb8dfba8b71b0afb3eb13b1800d760146104a8b9df184f1f2effe836716d73f9" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.807624 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.810734 4773 scope.go:117] "RemoveContainer" containerID="9cba79d9e1aeb74b1f1d65a14966cefb2ebf805d89216583f700f7bf5c224228" Jan 21 17:05:14 crc kubenswrapper[4773]: I0121 17:05:14.820146 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8f9kw"] Jan 21 17:05:15 crc kubenswrapper[4773]: I0121 17:05:15.394800 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" path="/var/lib/kubelet/pods/b7ca1119-312b-43c8-91b6-798ef6b132fd/volumes" Jan 21 17:06:55 crc kubenswrapper[4773]: I0121 17:06:55.205531 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:06:55 crc kubenswrapper[4773]: I0121 17:06:55.206157 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:07:25 crc kubenswrapper[4773]: 
I0121 17:07:25.205749 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:07:25 crc kubenswrapper[4773]: I0121 17:07:25.206294 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.206133 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.206779 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.206837 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.207757 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f"} 
pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.207825 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f" gracePeriod=600 Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.389617 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f" exitCode=0 Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.402870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f"} Jan 21 17:07:55 crc kubenswrapper[4773]: I0121 17:07:55.402937 4773 scope.go:117] "RemoveContainer" containerID="12b509ab56ee9664735ce7c3490a4ecb741eb32297ca758282ffd76e06362105" Jan 21 17:07:56 crc kubenswrapper[4773]: I0121 17:07:56.403355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerStarted","Data":"e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"} Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.283846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:18 crc kubenswrapper[4773]: E0121 17:08:18.284889 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="registry-server" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.284908 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="registry-server" Jan 21 17:08:18 crc kubenswrapper[4773]: E0121 17:08:18.284933 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="extract-content" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.284943 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="extract-content" Jan 21 17:08:18 crc kubenswrapper[4773]: E0121 17:08:18.284981 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="extract-utilities" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.284988 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="extract-utilities" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.285191 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ca1119-312b-43c8-91b6-798ef6b132fd" containerName="registry-server" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.287505 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.301859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.368852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.368936 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.369062 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9xnv\" (UniqueName: \"kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.470888 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.471415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.472741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.473078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.473225 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9xnv\" (UniqueName: \"kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.494793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9xnv\" (UniqueName: \"kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv\") pod \"redhat-operators-mt42p\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:18 crc kubenswrapper[4773]: I0121 17:08:18.610325 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:19 crc kubenswrapper[4773]: I0121 17:08:19.151463 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:19 crc kubenswrapper[4773]: I0121 17:08:19.633457 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f40a728-329a-4f46-90a4-01d6a038e0be" containerID="b2d76fac6f0b8dafa64989fc603bf088790802c744b6f5ddef05d9d56febaa02" exitCode=0 Jan 21 17:08:19 crc kubenswrapper[4773]: I0121 17:08:19.633569 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerDied","Data":"b2d76fac6f0b8dafa64989fc603bf088790802c744b6f5ddef05d9d56febaa02"} Jan 21 17:08:19 crc kubenswrapper[4773]: I0121 17:08:19.633802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerStarted","Data":"457fc77a1fb0a4e6c389df5244c5648aabbe6518667e11c93e26439f0444f280"} Jan 21 17:08:20 crc kubenswrapper[4773]: I0121 17:08:20.644390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerStarted","Data":"a3ca6d2825091e13fb07d2341c7eb283ceaebcdecd16bc8a8518ab4c850054cc"} Jan 21 17:08:23 crc kubenswrapper[4773]: I0121 17:08:23.675862 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f40a728-329a-4f46-90a4-01d6a038e0be" containerID="a3ca6d2825091e13fb07d2341c7eb283ceaebcdecd16bc8a8518ab4c850054cc" exitCode=0 Jan 21 17:08:23 crc kubenswrapper[4773]: I0121 17:08:23.676355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" 
event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerDied","Data":"a3ca6d2825091e13fb07d2341c7eb283ceaebcdecd16bc8a8518ab4c850054cc"} Jan 21 17:08:24 crc kubenswrapper[4773]: I0121 17:08:24.688581 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerStarted","Data":"fad9c27329e289a4b0df0d3bbedd15ef1a3f19293542b131d8518cce3b87cccf"} Jan 21 17:08:24 crc kubenswrapper[4773]: I0121 17:08:24.716769 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mt42p" podStartSLOduration=2.245325414 podStartE2EDuration="6.716746531s" podCreationTimestamp="2026-01-21 17:08:18 +0000 UTC" firstStartedPulling="2026-01-21 17:08:19.636587678 +0000 UTC m=+6264.561077310" lastFinishedPulling="2026-01-21 17:08:24.108008805 +0000 UTC m=+6269.032498427" observedRunningTime="2026-01-21 17:08:24.707566862 +0000 UTC m=+6269.632056484" watchObservedRunningTime="2026-01-21 17:08:24.716746531 +0000 UTC m=+6269.641236153" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.610604 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.611541 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.763158 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.768328 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.791902 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.804634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqtt\" (UniqueName: \"kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.804874 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.804960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.907150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.907559 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.907677 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvqtt\" (UniqueName: \"kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.908225 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.908551 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:28 crc kubenswrapper[4773]: I0121 17:08:28.927357 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqtt\" (UniqueName: \"kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt\") pod \"community-operators-mqzvr\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:29 crc kubenswrapper[4773]: I0121 17:08:29.110261 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:29 crc kubenswrapper[4773]: I0121 17:08:29.665898 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt42p" podUID="5f40a728-329a-4f46-90a4-01d6a038e0be" containerName="registry-server" probeResult="failure" output=< Jan 21 17:08:29 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 21 17:08:29 crc kubenswrapper[4773]: > Jan 21 17:08:29 crc kubenswrapper[4773]: I0121 17:08:29.756445 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:30 crc kubenswrapper[4773]: I0121 17:08:30.766573 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3c18d7e-ee89-4bc0-9466-4998836da067" containerID="d3a1c7ed610b6baebd6571334ebc6a46b68e58c18f70f5db82b7814a1d5eadc8" exitCode=0 Jan 21 17:08:30 crc kubenswrapper[4773]: I0121 17:08:30.766982 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerDied","Data":"d3a1c7ed610b6baebd6571334ebc6a46b68e58c18f70f5db82b7814a1d5eadc8"} Jan 21 17:08:30 crc kubenswrapper[4773]: I0121 17:08:30.767016 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerStarted","Data":"af90094895b886f2041d452b5b61423bdebd00fa7414d570e61d33373d68cdc1"} Jan 21 17:08:31 crc kubenswrapper[4773]: I0121 17:08:31.779508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerStarted","Data":"20b9baf1210a4829a15280c115de4d8e83c4ee716e5f27e2f028510cfd469b0f"} Jan 21 17:08:33 crc kubenswrapper[4773]: I0121 17:08:33.825982 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="b3c18d7e-ee89-4bc0-9466-4998836da067" containerID="20b9baf1210a4829a15280c115de4d8e83c4ee716e5f27e2f028510cfd469b0f" exitCode=0 Jan 21 17:08:33 crc kubenswrapper[4773]: I0121 17:08:33.826376 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerDied","Data":"20b9baf1210a4829a15280c115de4d8e83c4ee716e5f27e2f028510cfd469b0f"} Jan 21 17:08:34 crc kubenswrapper[4773]: I0121 17:08:34.838057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerStarted","Data":"62e581903a27402bb5edd22ed95fdbc26aa5397937aab962349b5f3b9ae2fb39"} Jan 21 17:08:34 crc kubenswrapper[4773]: I0121 17:08:34.867334 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mqzvr" podStartSLOduration=3.314699728 podStartE2EDuration="6.867313001s" podCreationTimestamp="2026-01-21 17:08:28 +0000 UTC" firstStartedPulling="2026-01-21 17:08:30.769052446 +0000 UTC m=+6275.693542068" lastFinishedPulling="2026-01-21 17:08:34.321665699 +0000 UTC m=+6279.246155341" observedRunningTime="2026-01-21 17:08:34.856913748 +0000 UTC m=+6279.781403380" watchObservedRunningTime="2026-01-21 17:08:34.867313001 +0000 UTC m=+6279.791802623" Jan 21 17:08:38 crc kubenswrapper[4773]: I0121 17:08:38.664481 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:38 crc kubenswrapper[4773]: I0121 17:08:38.709029 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:39 crc kubenswrapper[4773]: I0121 17:08:39.110998 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:39 crc kubenswrapper[4773]: I0121 17:08:39.111102 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:39 crc kubenswrapper[4773]: I0121 17:08:39.157730 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:39 crc kubenswrapper[4773]: I0121 17:08:39.927902 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:46 crc kubenswrapper[4773]: I0121 17:08:46.737272 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:46 crc kubenswrapper[4773]: I0121 17:08:46.737972 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mqzvr" podUID="b3c18d7e-ee89-4bc0-9466-4998836da067" containerName="registry-server" containerID="cri-o://62e581903a27402bb5edd22ed95fdbc26aa5397937aab962349b5f3b9ae2fb39" gracePeriod=2 Jan 21 17:08:46 crc kubenswrapper[4773]: I0121 17:08:46.949795 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3c18d7e-ee89-4bc0-9466-4998836da067" containerID="62e581903a27402bb5edd22ed95fdbc26aa5397937aab962349b5f3b9ae2fb39" exitCode=0 Jan 21 17:08:46 crc kubenswrapper[4773]: I0121 17:08:46.949840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerDied","Data":"62e581903a27402bb5edd22ed95fdbc26aa5397937aab962349b5f3b9ae2fb39"} Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.336599 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.337370 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mt42p" podUID="5f40a728-329a-4f46-90a4-01d6a038e0be" containerName="registry-server" containerID="cri-o://fad9c27329e289a4b0df0d3bbedd15ef1a3f19293542b131d8518cce3b87cccf" gracePeriod=2 Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.622204 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.759268 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvqtt\" (UniqueName: \"kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt\") pod \"b3c18d7e-ee89-4bc0-9466-4998836da067\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.759478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content\") pod \"b3c18d7e-ee89-4bc0-9466-4998836da067\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.759787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities\") pod \"b3c18d7e-ee89-4bc0-9466-4998836da067\" (UID: \"b3c18d7e-ee89-4bc0-9466-4998836da067\") " Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.761461 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities" (OuterVolumeSpecName: "utilities") pod "b3c18d7e-ee89-4bc0-9466-4998836da067" (UID: "b3c18d7e-ee89-4bc0-9466-4998836da067"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.768964 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt" (OuterVolumeSpecName: "kube-api-access-vvqtt") pod "b3c18d7e-ee89-4bc0-9466-4998836da067" (UID: "b3c18d7e-ee89-4bc0-9466-4998836da067"). InnerVolumeSpecName "kube-api-access-vvqtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.847194 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3c18d7e-ee89-4bc0-9466-4998836da067" (UID: "b3c18d7e-ee89-4bc0-9466-4998836da067"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.862931 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvqtt\" (UniqueName: \"kubernetes.io/projected/b3c18d7e-ee89-4bc0-9466-4998836da067-kube-api-access-vvqtt\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.862991 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.863031 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3c18d7e-ee89-4bc0-9466-4998836da067-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.963664 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mqzvr" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.963675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqzvr" event={"ID":"b3c18d7e-ee89-4bc0-9466-4998836da067","Type":"ContainerDied","Data":"af90094895b886f2041d452b5b61423bdebd00fa7414d570e61d33373d68cdc1"} Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.963815 4773 scope.go:117] "RemoveContainer" containerID="62e581903a27402bb5edd22ed95fdbc26aa5397937aab962349b5f3b9ae2fb39" Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.968369 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f40a728-329a-4f46-90a4-01d6a038e0be" containerID="fad9c27329e289a4b0df0d3bbedd15ef1a3f19293542b131d8518cce3b87cccf" exitCode=0 Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.968444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerDied","Data":"fad9c27329e289a4b0df0d3bbedd15ef1a3f19293542b131d8518cce3b87cccf"} Jan 21 17:08:47 crc kubenswrapper[4773]: I0121 17:08:47.988951 4773 scope.go:117] "RemoveContainer" containerID="20b9baf1210a4829a15280c115de4d8e83c4ee716e5f27e2f028510cfd469b0f" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.123901 4773 scope.go:117] "RemoveContainer" containerID="d3a1c7ed610b6baebd6571334ebc6a46b68e58c18f70f5db82b7814a1d5eadc8" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.166577 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.255502 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.269677 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mqzvr"] Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.270146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities\") pod \"5f40a728-329a-4f46-90a4-01d6a038e0be\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.270461 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content\") pod \"5f40a728-329a-4f46-90a4-01d6a038e0be\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.270530 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9xnv\" (UniqueName: \"kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv\") pod \"5f40a728-329a-4f46-90a4-01d6a038e0be\" (UID: \"5f40a728-329a-4f46-90a4-01d6a038e0be\") " Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.282782 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities" (OuterVolumeSpecName: "utilities") pod "5f40a728-329a-4f46-90a4-01d6a038e0be" (UID: "5f40a728-329a-4f46-90a4-01d6a038e0be"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.283037 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv" (OuterVolumeSpecName: "kube-api-access-z9xnv") pod "5f40a728-329a-4f46-90a4-01d6a038e0be" (UID: "5f40a728-329a-4f46-90a4-01d6a038e0be"). InnerVolumeSpecName "kube-api-access-z9xnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.373520 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9xnv\" (UniqueName: \"kubernetes.io/projected/5f40a728-329a-4f46-90a4-01d6a038e0be-kube-api-access-z9xnv\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.373557 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.385582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f40a728-329a-4f46-90a4-01d6a038e0be" (UID: "5f40a728-329a-4f46-90a4-01d6a038e0be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.476035 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a728-329a-4f46-90a4-01d6a038e0be-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.982364 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt42p" event={"ID":"5f40a728-329a-4f46-90a4-01d6a038e0be","Type":"ContainerDied","Data":"457fc77a1fb0a4e6c389df5244c5648aabbe6518667e11c93e26439f0444f280"} Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.982427 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt42p" Jan 21 17:08:48 crc kubenswrapper[4773]: I0121 17:08:48.983604 4773 scope.go:117] "RemoveContainer" containerID="fad9c27329e289a4b0df0d3bbedd15ef1a3f19293542b131d8518cce3b87cccf" Jan 21 17:08:49 crc kubenswrapper[4773]: I0121 17:08:49.010085 4773 scope.go:117] "RemoveContainer" containerID="a3ca6d2825091e13fb07d2341c7eb283ceaebcdecd16bc8a8518ab4c850054cc" Jan 21 17:08:49 crc kubenswrapper[4773]: I0121 17:08:49.035502 4773 scope.go:117] "RemoveContainer" containerID="b2d76fac6f0b8dafa64989fc603bf088790802c744b6f5ddef05d9d56febaa02" Jan 21 17:08:49 crc kubenswrapper[4773]: I0121 17:08:49.048293 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:49 crc kubenswrapper[4773]: I0121 17:08:49.060120 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mt42p"] Jan 21 17:08:49 crc kubenswrapper[4773]: I0121 17:08:49.397004 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f40a728-329a-4f46-90a4-01d6a038e0be" path="/var/lib/kubelet/pods/5f40a728-329a-4f46-90a4-01d6a038e0be/volumes" Jan 21 17:08:49 crc 
kubenswrapper[4773]: I0121 17:08:49.397972 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c18d7e-ee89-4bc0-9466-4998836da067" path="/var/lib/kubelet/pods/b3c18d7e-ee89-4bc0-9466-4998836da067/volumes" Jan 21 17:09:55 crc kubenswrapper[4773]: I0121 17:09:55.206105 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:09:55 crc kubenswrapper[4773]: I0121 17:09:55.207110 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:10:25 crc kubenswrapper[4773]: I0121 17:10:25.205557 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:10:25 crc kubenswrapper[4773]: I0121 17:10:25.206077 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:10:55 crc kubenswrapper[4773]: I0121 17:10:55.205896 4773 patch_prober.go:28] interesting pod/machine-config-daemon-rfzvc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 17:10:55 crc kubenswrapper[4773]: I0121 17:10:55.206821 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 17:10:55 crc kubenswrapper[4773]: I0121 17:10:55.206898 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" Jan 21 17:10:55 crc kubenswrapper[4773]: I0121 17:10:55.207884 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"} pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 17:10:55 crc kubenswrapper[4773]: I0121 17:10:55.207956 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerName="machine-config-daemon" containerID="cri-o://e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb" gracePeriod=600 Jan 21 17:10:55 crc kubenswrapper[4773]: E0121 17:10:55.329206 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" 
podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:10:56 crc kubenswrapper[4773]: I0121 17:10:56.199381 4773 generic.go:334] "Generic (PLEG): container finished" podID="dff586d5-9d98-4ec2-afb1-e550fd4f3678" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb" exitCode=0
Jan 21 17:10:56 crc kubenswrapper[4773]: I0121 17:10:56.199478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" event={"ID":"dff586d5-9d98-4ec2-afb1-e550fd4f3678","Type":"ContainerDied","Data":"e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"}
Jan 21 17:10:56 crc kubenswrapper[4773]: I0121 17:10:56.199564 4773 scope.go:117] "RemoveContainer" containerID="f58d85080097b79fe7e96de02a586d87703ae58f21ea5a84fd022f30d8e66c7f"
Jan 21 17:10:56 crc kubenswrapper[4773]: I0121 17:10:56.201243 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:10:56 crc kubenswrapper[4773]: E0121 17:10:56.201898 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:11:10 crc kubenswrapper[4773]: I0121 17:11:10.393633 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:11:10 crc kubenswrapper[4773]: E0121 17:11:10.394720 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:11:21 crc kubenswrapper[4773]: I0121 17:11:21.383566 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:11:21 crc kubenswrapper[4773]: E0121 17:11:21.384221 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:11:35 crc kubenswrapper[4773]: I0121 17:11:35.390277 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:11:35 crc kubenswrapper[4773]: E0121 17:11:35.390998 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:11:46 crc kubenswrapper[4773]: I0121 17:11:46.384504 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:11:46 crc kubenswrapper[4773]: E0121 17:11:46.386355 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:12:01 crc kubenswrapper[4773]: I0121 17:12:01.384525 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:12:01 crc kubenswrapper[4773]: E0121 17:12:01.385773 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:12:15 crc kubenswrapper[4773]: I0121 17:12:15.402493 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:12:15 crc kubenswrapper[4773]: E0121 17:12:15.404176 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:12:28 crc kubenswrapper[4773]: I0121 17:12:28.384581 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:12:28 crc kubenswrapper[4773]: E0121 17:12:28.385345 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:12:43 crc kubenswrapper[4773]: I0121 17:12:43.383738 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:12:43 crc kubenswrapper[4773]: E0121 17:12:43.384390 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:12:55 crc kubenswrapper[4773]: I0121 17:12:55.384914 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:12:55 crc kubenswrapper[4773]: E0121 17:12:55.385722 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:13:08 crc kubenswrapper[4773]: I0121 17:13:08.384339 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:13:08 crc kubenswrapper[4773]: E0121 17:13:08.385130 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"
Jan 21 17:13:21 crc kubenswrapper[4773]: I0121 17:13:21.386277 4773 scope.go:117] "RemoveContainer" containerID="e9302a77208d5d7c17e9b69ef494009650320ab86d1d8610ac1d486315ea9ddb"
Jan 21 17:13:21 crc kubenswrapper[4773]: E0121 17:13:21.387163 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rfzvc_openshift-machine-config-operator(dff586d5-9d98-4ec2-afb1-e550fd4f3678)\"" pod="openshift-machine-config-operator/machine-config-daemon-rfzvc" podUID="dff586d5-9d98-4ec2-afb1-e550fd4f3678"